Required Skills

ETL, SQL, Python, Java

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 31st Jan 2024

Job Details

  • Minimum of ten (10) years of experience in data warehousing, ETL, and application development.
  • 5+ years of work experience with Hadoop-based Data Lake solutions.
  • 3+ years of experience with cloud-native and cloud-agnostic Data Lake/Data Warehouse solutions, preferably Snowflake and AWS.
  • Strong experience in the design and development of Snowflake data pipelines using Snowpark, Streamlit, and SnowCLI, hosted on AWS.
  • Hands-on experience with AWS services such as EC2, S3, Lambda, Step Functions, and Glue, as well as Kafka and Airflow.
  • 1+ years of experience with data flow tools such as Apache Airflow.
  • Data engineering experience with a strong programming background in SQL, Python, Java, and Linux shell scripting (Bash).
  • Prior experience leading ETL and Data Warehouse/Data Lake development is a plus.
  • Ability to work as part of a team, self-motivation, adaptability, and a positive attitude.
  • Must have strong communication skills.

Company Information