Required Skills

Data Warehousing/ETL, Python, Snowflake, Airflow, AWS, JavaScript, Spark SQL, Lambda

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of Positions: 1

  • Posted: 17th Feb 2021

JOB DETAIL

Must-Have Skills: 10+ years of experience; Python, Snowflake, Airflow, AWS, and Data Warehousing/ETL

  • Hands-on experience in Snowflake, Airflow orchestration, JavaScript, Python, Spark SQL, and Lambda
  • Must have 10+ years of experience in delivering data engineering projects
  • Strong knowledge of ETL, data warehousing, and business intelligence
  • Should be able to configure pipelines to ingest and process data from data sources into the data platform, including configuration of Airflow ingestion pipelines and Snowflake external tables/Snowpipe.
  • Monitor and respond to scheduled workloads that feed data to and from the data platform.
  • Track run schedules and schedule changes during maintenance windows to ensure workloads execute after the maintenance window is complete.
  • Communicate and work with Suppliers on schema drift and other Supplier issues.
  • Communicate delays or failure impacts to stakeholders, escalating to Leadership/EOC as needed.
  • Create and execute quality scripts to monitor and maintain the accuracy of our data.
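The ingestion responsibility above can be sketched as a minimal Airflow DAG. This is an illustrative outline only, not the employer's actual pipeline: the DAG ID, schedule, and task bodies are hypothetical, and a real implementation would use Snowflake operators or Snowpipe auto-ingest rather than stubbed callables.

```python
# Hypothetical Airflow 2.x DAG sketching the pattern described above:
# ingest from a source into Snowflake, then run quality checks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_to_snowflake():
    # In practice this would COPY INTO a Snowflake table or rely on
    # Snowpipe/external tables; stubbed here for illustration.
    ...


def run_quality_checks():
    # Placeholder for the quality scripts described in the posting.
    ...


with DAG(
    dag_id="supplier_ingest",      # hypothetical name
    start_date=datetime(2021, 2, 17),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_to_snowflake)
    quality = PythonOperator(task_id="quality", python_callable=run_quality_checks)
    ingest >> quality  # run quality checks only after ingestion succeeds
```

The `ingest >> quality` dependency mirrors the run-schedule responsibility: downstream workloads execute only after the upstream feed completes.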
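The quality-script responsibility could look something like the following minimal sketch. Field names and the null-rate threshold are hypothetical; a real script would query Snowflake rather than take rows in memory.

```python
# Minimal data-quality check sketch (hypothetical fields/thresholds).
# Takes rows as dicts, as if fetched from a warehouse table.

def check_quality(rows, required_fields, max_null_rate=0.05):
    """Return a list of human-readable quality failures (empty if clean)."""
    failures = []
    if not rows:
        return ["table is empty"]
    for field in required_fields:
        nulls = sum(1 for row in rows if row.get(field) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return failures
```

A monitoring job would run checks like this on a schedule and alert stakeholders when the returned list is non-empty.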

Company Information