Required Skills

Big data framework Hadoop (Hive, Impala, HDFS, etc.); Hadoop-based data lake or warehouse.

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 3rd Nov 2020

JOB DETAIL

ETL Developer - Big Data

Location: New York, NY

No OPT - minimum 10 years of experience.
12 years' work experience with the big data framework Hadoop (Hive, Impala, HDFS, etc.).

Hands-on experience building a Hadoop-based data lake or warehouse.

Experience in traditional ETL technologies as well as Hadoop Sqoop and Flume.

Excellent problem-solving, analytical, and leadership skills.

Excellent written and verbal communication skills. 

Experience in optimizing large data loads.

Develop detailed ETL specifications based on business requirements. 

Analyse functional specifications and assist in designing potential technical solutions. 

Identify data sources and work with the source system team and data analysts to define data extraction methodologies.

Analyse existing designs and interfaces and apply design modifications or enhancements.

Good knowledge of writing complex queries in PL/SQL.

Required Skills

Team lead experience - global delivery model.

Development experience in languages: Hive/Impala/Python.

Demonstrated experience in databases such as Hadoop, Teradata, and other big data platforms.

Sound knowledge of cloud (Azure preferred).

Demonstrated experience in ETL tools such as Informatica.

 

Company Information