Required Skills

AWS, SQL, Python/PySpark, DevOps

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG:

  • PG:

Other Information

  • No. of positions: ( )

  • Posted: 10th Nov 2025

Job Details

  • Design, build, and manage scalable, reliable data pipelines on AWS using services such as S3, Redshift, Glue, Lambda, and EMR.
  • Develop ETL processes to ingest, transform, and store large datasets from multiple data sources.
  • Implement and optimize SQL queries for data extraction, transformation, and loading (ETL).
  • Write efficient Python/PySpark code for processing and analyzing big data.
  • Collaborate with cross-functional teams to gather and understand requirements, and translate them into technical solutions.
  • Implement DevOps practices such as CI/CD pipelines, automated testing, and version control to streamline deployment and maintenance.
  • Monitor and troubleshoot data pipeline issues, ensuring system performance and data integrity.
  • Support data governance and ensure data security and compliance with best practices and organizational policies.
  • Document data processes, data flows, and architecture to ensure knowledge sharing across teams.

Qualifications:

  • Bachelor's degree in Computer Science, Data Engineering, or a related field.
  • 3+ years of experience in data engineering with a focus on AWS cloud services.
  • Proficiency in SQL for database management and query optimization.
  • Strong programming skills in Python, with experience using PySpark for big data processing.
  • Experience with AWS services such as S3, Redshift, Glue, Lambda, and EMR.
  • Experience with DevOps practices including CI/CD pipelines, version control (Git), and infrastructure-as-code (e.g., Terraform, CloudFormation).
  • Familiarity with data modeling, data warehousing concepts, and best practices in data pipeline architecture.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication and collaboration abilities.
  • AWS certifications (e.g., AWS Certified Data Analytics, AWS Certified Solutions Architect) are a plus.

Company Information