Required Skills

AWS, SQL, Airflow, Python, PySpark

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 31st Jul 2024

Job Details

  • Design, build, and maintain our data infrastructure using AWS, SQL, Airflow, Python, PySpark, and Redshift.
  • Develop and implement data pipelines to extract, transform, and load data from various sources.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Optimize data storage and retrieval for performance and scalability.
  • Monitor and troubleshoot data pipelines to ensure data accuracy and availability.
  • Develop and maintain documentation of data infrastructure and processes.
  • Stay up-to-date with the latest technologies and trends in data engineering.

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of experience in data engineering or a related field.
  • Strong proficiency in AWS, SQL, Airflow, Python, PySpark, and Redshift, or equivalent technologies.
  • Experience with data modeling, data warehousing, and ETL processes.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Ability to work independently and as part of a team.
  • Experience with Agile development methodologies is a plus.

Company Information