Required Skills

HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 9th Oct 2023

JOB DETAIL

  • Need 9+ years’ experience with ETL, Informatica, SQL, Teradata…
  • 9+ years’ experience with the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
  • Extensive knowledge of big data enterprise architecture (Cloudera preferred)
  • Excellent analytical capabilities and a strong interest in algorithms
  • Experienced in HBase, RDBMS, SQL, ETL, and data analysis
  • Experience with NoSQL technologies (e.g., Cassandra, MongoDB)
  • Experienced in scripting (Unix/Linux) and scheduling (Autosys)
  • Experience with team delivery/release processes and cadences for code deployment and release

Company Information