Build scalable and reliable data pipelines to support data ingestion (batch/streaming) and transformation from multiple data sources using SQL, AWS, Snowflake, and data integration technologies.
Create low-level design artifacts, including architecture diagrams and mapping specifications.
Work with engineering, technology, and business stakeholders to understand data requirements.
Create unit/integration tests and implement automated build and deployment.
Participate in code reviews to ensure adherence to standards and best practices.
Deploy, monitor, and maintain production systems.
Analyze and organize raw data for ingestion.
Build proofs of concept and prototypes as needed to evaluate new tools and technologies.
Create and update user stories in the backlog.
Collaborate with product owners, data analysts, data scientists, and architects.
Key Requirements:
Strong ELT/ETL experience coding complex transformations (not just extract/load mappings) using Matillion.
Minimum of one year of hands-on experience working in Snowflake.
Experience with Python and AWS Lambda.
CI/CD experience using Azure DevOps, Git repositories, and build and release pipelines.