Required Skills

Data Engineer: AWS S3, AWS Glue, AWS RDS, PySpark, Python, Confluent Kafka

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 7th Jan 2021

Job Detail

Role: Data Engineer
Location: Tempe, AZ | Santa Clara
Duration: 12+ Months
Client: TCS
Experience: 8+ Years
Job Description:

The candidate must have 8-12 years of overall experience in ETL, data warehousing, data lakes, data quality, etc., including:

  • Minimum of 2 years' experience implementing AWS technologies such as AWS S3, AWS Glue, AWS RDS, PySpark, Python, and Confluent Kafka
  • Minimum of 2 years' experience implementing ETL technologies such as Informatica, BODS, and SSIS
  • Expertise in scripting technologies such as Python, Spark/PySpark, and Linux
  • Batch solutions (AWS Glue, AWS Data Pipeline)
  • Distributed compute solutions (Spark, EMR)
  • Analyzing data through standard SQL (Athena)
  • Functional solutions (AWS Lambda)
  • Distributed storage (Redshift, S3)
  • Real-time solutions (Kafka, AWS Kinesis)
  • Experience with or understanding of Agile, Scrum, and CI/CD
  • Experience with Redshift query optimization, conversion, and execution
  • Designing and building ETL jobs to support the customer's data lake and enterprise data warehouse (see the sketch after this list)
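
For illustration only: the sketch below shows the kind of batch ETL job this posting describes, assuming a PySpark runtime such as AWS Glue or EMR. All bucket names, paths, and column names are hypothetical placeholders, not details of the client engagement.

    # Minimal PySpark batch ETL sketch: S3 CSV -> data-quality rules -> partitioned Parquet.
    # All S3 paths and columns below are hypothetical; s3:// paths assume an
    # EMR/Glue runtime (elsewhere use s3a:// with the hadoop-aws package).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw CSV landed in the data lake.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("s3://example-raw-bucket/orders/"))

    # Transform: basic data-quality and typing rules.
    clean = (raw
             .dropDuplicates(["order_id"])
             .filter(F.col("order_id").isNotNull())
             .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
             .withColumn("amount", F.col("amount").cast("double")))

    # Load: write partitioned Parquet for downstream SQL engines (e.g., Athena) to query.
    (clean.write
     .mode("overwrite")
     .partitionBy("order_date")
     .parquet("s3://example-curated-bucket/orders/"))

    spark.stop()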

Company Information