Responsibilities:
Collaborate with clients and cross-functional teams to understand business requirements and to design and implement data solutions.
Develop architecture blueprints, design specifications, and implementation plans for data solutions.
Design, build, and deploy scalable, reliable, and efficient data pipelines using Snowflake, Azure Data Factory, and Azure Databricks, applying Data Vault and dimensional modeling techniques.
Develop and maintain data models using best practices in dimensional modeling and data vault modeling techniques.
Design and implement ETL processes and workflows to move data from various sources to Snowflake and Azure Data Lake.
Optimize and fine-tune data processing and storage systems to achieve high performance, scalability, and availability.
Define and implement data security and privacy controls to ensure compliance with industry and regulatory standards.
Collaborate with data engineers, data scientists, and other stakeholders to ensure data solutions meet business needs and provide value.
Requirements:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
7-10 years of experience designing and implementing data solutions using Snowflake, Azure Data Factory, and Azure Databricks, including Data Vault and dimensional modeling.
Strong expertise in dimensional modeling and data vault modeling techniques.
Proven experience in building scalable and reliable ETL pipelines using Snowflake, Azure Data Factory, and Azure Databricks.
Experience with data governance (MDM, data quality, and data catalogs), data security, and privacy compliance.
Excellent communication and collaboration skills to work effectively with clients, cross-functional teams, and stakeholders.
Ability to work independently and lead projects from inception to completion.
Ability to work with teams and customers across multiple geographies, with the right attitude to solve critical and complex problems.
Certifications in Snowflake, Azure Data Factory, or Azure Databricks are a plus.