Required Skills

ETL, JavaScript

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th Mar 2024

JOB DETAIL

MINIMUM EDUCATION AND RELATED WORK EXPERIENCE:

  • Bachelor's degree in a technology field and 5 years of prior IT work experience; OR
  • Completion of a Coding/IT Bootcamp and 5 years of prior IT work experience; OR
  • 7 years of experience in most phases of IT systems deployments in one or more of the following areas: design and deployment of cloud services, software development, hardware installation, system administration, cybersecurity, or another functional IT area.
RESPONSIBILITIES AND SKILLS:

  • Design, build, and maintain scalable and reliable data pipelines and ETL processes.
  • Develop and maintain data infrastructure systems, including data warehouses and data lakes.
  • Collaborate with data scientists and analysts to understand data requirements and to create mapping documents and data models.
  • Ensure data quality and integrity by implementing data validation and cleansing processes.
  • Optimize data pipelines and systems for performance and scalability.
  • Troubleshoot data pipeline and system issues and provide timely resolution.
  • Create and maintain documentation of data architecture, data pipelines, and data systems.
  • Good problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Should be able to mentor team members.
  • Subject matter expert in specific areas, with the ability to lead projects.
  • Good understanding of writing complex SQL queries.
  • Good knowledge of scripting languages such as Python, Node.js, JavaScript, and Java.
  • Good knowledge of AWS Cloud Platform services such as AWS S3, AWS Lambda, Glue, Managed Airflow, IAM, Amazon RDS, Amazon Redshift, and MSK.
  • Familiarity with Kafka and Apache Airflow.
  • Familiarity with creating infrastructure as code using Terraform.


Company Information