Required Skills

Computer Science, Software Engineering, Information Technology, HDFS, Hive, Spark, Sqoop, Kafka, NiFi, Python, PySpark, Java, BDM, StreamSets

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

Employment Type

  • Permanent Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 25th Nov 2020

Job Detail

• Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field is required.

• 10-12 years of overall experience in architecting and building large-scale, distributed big data solutions.

• Experience in at least 2-3 Big Data implementation projects.

• Solid experience in Hadoop ecosystem development, including HDFS, Hive, Spark, Sqoop, Kafka, NiFi, real-time streaming technologies, and the broader open-source big data stack.

• Working experience with the Cloudera distribution is preferred.

• Must have experience in Python, PySpark, and Java.

• Must possess excellent communication skills.

• Strong analytical, technical, and troubleshooting skills.

• Experience leading teams and/or managing workloads for team members.

• Nice to have: working experience with Informatica BDM and StreamSets.

Company Information