Required Skills

Snowflake, Airflow orchestration, JavaScript, Python, Spark SQL, Lambda, ETL, data warehousing, business intelligence

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 10th Feb 2021

Job Detail

1. Hands-on experience in Snowflake, Airflow orchestration, JavaScript, Python, Spark SQL, and Lambda.

2. Must have 10+ years of experience delivering data engineering projects.

3. Strong knowledge of ETL, data warehousing, and business intelligence.

4. Should be able to configure pipelines to ingest data from data sources into the data platform, including configuration of Airflow ingestion pipelines and/or Snowflake external tables/Snowpipe.

5. Should be able to configure pipelines to ingest and process data from data sources into the data platform, including configuration of Airflow ingestion pipelines and Snowflake.

6. Monitor and respond to scheduled workloads that feed data to and from the data platform.

7. Closely follow run schedules, and track schedule changes during maintenance windows, to ensure workloads execute only after the maintenance window completes.

8. Communicate and work with Suppliers on schema drift and/or other Supplier issues.

9. Communicate delays or impacts of failures to stakeholders, and escalate as needed to Leadership/EOC.

10. Create and execute quality scripts to monitor and maintain the accuracy of our data.
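The pipeline-configuration duties above (items 4 and 5) typically involve Snowpipe auto-ingest DDL. As a rough sketch only, with hypothetical pipe, table, stage, and file-format names (in practice these come from the pipeline's configuration), a helper that builds such a statement might look like:

```python
def snowpipe_ddl(pipe: str, table: str, stage: str, file_format: str) -> str:
    """Build a CREATE PIPE statement that auto-ingests files landing
    on an external stage into a target table via COPY INTO."""
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')"
    )

# Hypothetical names for illustration only.
ddl = snowpipe_ddl("orders_pipe", "raw.orders", "orders_stage", "csv_fmt")
print(ddl)
```

The generated statement would then be executed against Snowflake through whatever connector the platform uses.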
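Item 10 calls for quality scripts that guard data accuracy. A minimal sketch, assuming in-memory rows and assumed field names (real checks would usually run as SQL against Snowflake tables), might be:

```python
def run_quality_checks(rows, required_fields, min_rows=1):
    """Return a list of data-quality issues found in a batch of rows:
    too few rows overall, or required fields missing/empty in any row."""
    issues = []
    if len(rows) < min_rows:
        issues.append(f"row count {len(rows)} below minimum {min_rows}")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing required field '{field}'")
    return issues

# Example run against a tiny in-memory batch (hypothetical fields).
batch = [{"order_id": 1, "amount": 9.99}, {"order_id": None, "amount": 5.00}]
print(run_quality_checks(batch, ["order_id", "amount"]))
```

An empty result list means the batch passed; a non-empty list would typically be logged or used to alert the on-call engineer.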

 

Company Information