US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-to-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 27th Nov 2020
• Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field required.
• 8-10 years of overall experience architecting and building large-scale, distributed big data solutions.
• Experience in at least 2-3 Big Data implementation projects.
• Solid experience in Hadoop ecosystem development, including HDFS, Hive, Impala, Spark, Sqoop, Kafka, and NiFi; experience with real-time streaming technologies, the broader open-source big data stack, and AWS technologies is a plus.
• Working experience with the Cloudera distribution is preferred.
• Experience in Python and PySpark (must have); Java preferred.
• Solid experience with UNIX and shell scripting; familiarity with Git and GxP processes.
• Must possess excellent communication skills and be able to create documentation.
• Strong analytical, technical, and troubleshooting skills.
• Experience leading teams and/or managing workloads for team members.
• Nice to have: working experience with Informatica BDM and StreamSets.