Minimum eight years of relevant experience as a data architect building large-scale data solutions.
Experience migrating existing on-premises data warehouses to Snowflake.
Experience architecting large-scale data modernization, data migration, and data warehousing solutions, including experience with cloud-based data platforms such as Snowflake.
Experience architecting and implementing end-to-end modern data solutions on AWS, including Redshift and S3.
Solid understanding of, and at least one implementation experience with, data engineering processing frameworks and approaches such as AWS Glue, ETL tools, and ELT techniques.
Experience defining and operationalizing data strategy, data governance, data lineage, and data quality standards.
Experience designing highly scalable ETL processes with complex data transformations and diverse data formats, including error handling and monitoring.
Extensive knowledge of data engineering, data integration, and data management concepts (e.g., APIs, ETL, MDM, CRUD, Pub/Sub).
Experience with data modeling.
Experience with structured and hierarchical datasets (e.g., JSON, XML).
Engineering experience with large-scale system integration and analytics projects.