Required Skills

Talend Big Data Edition, AWS

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Jul 2022

Job Detail

  • Development knowledge of Spark, PySpark, AWS Lambda, and Python
  • Experience designing and developing ETL jobs and data transformations using SQL
  • Working experience with Database technologies – SQL Server, Oracle, MySQL, PostgreSQL, MongoDB
  • Experience in Data Engineering or working on Enterprise Data Warehouses & Business Intelligence environments
  • Understanding of data modeling, data access, data storage techniques, data structures and algorithms
  • Experience with at least one ETL tool like Talend, Informatica, etc.
  • 5+ years of strong data warehousing experience with ETL integration tools – Talend Big Data Platform, Informatica, Ab Initio
  • Looking for resources who can support the design, development, and implementation of data pipelines for enterprise data products using Talend (candidates with experience in other ETL tools will be considered).
  • Experience building Standard Jobs and Big Data/Hadoop Jobs in Talend.
  • Experience with AWS services such as EMR, Redshift, RDS, Lambda, S3, etc.
  • Experience with Talend DQ and Profiling
  • CI/CD Pipeline setup using Bamboo.
  • Must-have skillset: Talend DI, Talend Big Data Edition, AWS (S3, EMR, Redshift), Spark

Company Information