Must Have Skills: 10+ years of experience; Python, Snowflake, Airflow, AWS, and Data Warehousing/ETL
- Hands-on experience with Snowflake, Airflow orchestration, JavaScript, Python, Spark SQL, and Lambda
- Must have 10+ years of experience in delivering data engineering projects
- Strong knowledge of ETL, data warehousing, and business intelligence
- Should be able to configure pipelines to ingest and process data from data sources to the data platform. This will include configuration of Airflow ingestion pipelines and/or Snowflake external tables/Snowpipe.
- Monitor and respond to scheduled workloads that feed data to and from the data platform.
- Closely follow run schedules, and track schedule changes during maintenance windows, to ensure workloads execute after the maintenance window completes.
- Communicate and work with Suppliers on schema drift and/or other Supplier issues.
- Communicate delays or impacts of failures to stakeholders, and escalate to Leadership/EOC as needed.
- Create and execute quality scripts to monitor and maintain the accuracy of our data.
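The schema-drift responsibility above can be sketched in plain Python. This is a minimal, hypothetical example (the table, column names, and types are invented for illustration, not taken from the posting): it compares an expected schema against the columns actually received from a supplier feed and reports what drifted.

```python
# Hypothetical expected schema for a supplier feed; column names and
# types are illustrative only.
EXPECTED_SCHEMA = {"order_id": "NUMBER", "order_date": "DATE", "amount": "FLOAT"}

def detect_schema_drift(expected, received):
    """Compare an expected schema (column -> type) against a received one.

    Returns a dict listing columns missing from the feed, unexpected new
    columns, and columns whose declared type changed.
    """
    missing = sorted(set(expected) - set(received))
    added = sorted(set(received) - set(expected))
    type_changes = {
        col: (expected[col], received[col])
        for col in set(expected) & set(received)
        if expected[col] != received[col]
    }
    return {"missing": missing, "added": added, "type_changes": type_changes}

# Example: the supplier renamed a column and changed a type.
received = {"order_id": "NUMBER", "order_dt": "DATE", "amount": "VARCHAR"}
drift = detect_schema_drift(EXPECTED_SCHEMA, received)
```

A report like `drift` would then feed the supplier communication described above, rather than failing silently in the pipeline.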
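The "quality scripts" duty in the last bullet can likewise be sketched as a small check harness. This is an assumed shape, not the team's actual tooling: in practice the rows would come from a Snowflake query, but here they are inline sample data so the logic is self-contained.

```python
# Sample rows standing in for the result of a warehouse query.
rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@example.com"},
]

def null_rate(rows, column):
    """Fraction of rows where `column` is NULL/None."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def run_quality_checks(rows, expected_min_rows, max_null_rate, column):
    """Run basic accuracy checks; returns (check_name, passed) pairs."""
    return [
        ("row_count", len(rows) >= expected_min_rows),
        (f"null_rate_{column}", null_rate(rows, column) <= max_null_rate),
    ]

# One of three emails is NULL (rate ~0.33), so the null-rate check fails.
results = run_quality_checks(rows, expected_min_rows=3,
                             max_null_rate=0.25, column="email")
```

Failed checks would typically be surfaced to the same monitoring and escalation path described in the bullets above.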