Required Skills

HDFS, Hive, Spark, Sqoop, Kafka, NiFi, Python, PySpark; strong analytical and technical skills

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 26th Nov 2020

JOB DETAIL

• Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field required.

• 8-10 years of overall experience in architecting and building large-scale, distributed big data solutions.

• Experience in at least 2-3 big data implementation projects.

• Solid experience in Hadoop ecosystem development, including HDFS, Hive, Spark, Sqoop, Kafka, NiFi, real-time streaming technologies, and a host of other open-source big data tools.

• Working experience in Cloudera distribution is preferred.

• Experience in Python and PySpark (must have); Java (preferred).

• Must possess excellent communication skills.

• Strong analytical, technical, and troubleshooting skills.

• Experience leading teams and/or managing workloads for team members.

• Nice to have: working experience in Informatica BDM and StreamSets.

 

Please share profiles at srinu@propelsys.com, or reach me at 469-443-4704.
