Required Skills

Bachelor's degree in Software Engineering, Information Technology, or a related field required, with experience architecting and building large-scale, distributed big data solutions. Big Data implementation experience with HDFS, Hive, Spark, Sqoop, Kafka, NiFi, Python, PySpark, and Java.

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 25th Nov 2020

JOB DETAIL

• Bachelor's degree in Computer Science, Software Engineering, Information Technology, or related field required

• 10-12 years of overall experience architecting and building large-scale, distributed big data solutions.

• Experience in at least 2-3 Big Data implementation projects.

• Solid experience in Hadoop ecosystem development, including HDFS, Hive, Spark, Sqoop, Kafka, NiFi, real-time streaming technologies, and a host of other big data open source tools.

• Working experience in Cloudera distribution is preferred.

• Must have experience in Python, PySpark, and Java.

• Must possess excellent communication skills.

• Strong analytical, technical, and troubleshooting skills.

• Experience leading teams and/or managing workloads for team members.

• Nice to have: working experience in Informatica BDM and StreamSets.

Thanks and Regards,

Harishwar Reddy
harishwar.r@axiustek.com
Direct: 201-898-2218
Mobile: 201-898-2218

A women- and minority-certified, diverse, and E-Verified company

____________________________________

Axius Technologies Inc.

Company Information