Required Skills

MapReduce, Flume, Pig, Scala, Python, Java

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Apr 2024

JOB DETAIL


  • The candidate will be responsible for designing, building, and maintaining security-enabled Big Data workflows/pipelines that process billions of records into and out of our Hadoop Distributed File System (HDFS).
  • The candidate will engage in sprint requirements and design discussions. 
  • The candidate will be proactive in troubleshooting and resolving data processing issues.
  • The candidate should be a highly accountable self-starter with a strong sense of urgency who can work autonomously with limited direction.
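For candidates unfamiliar with the MapReduce model named in the required skills, the core pattern behind these batch workflows can be sketched in a few lines. This is a framework-free illustration in plain Python (no Hadoop cluster or real pipeline assumed; `map_phase` and `reduce_phase` are hypothetical names for this sketch):

```python
# Minimal sketch of the MapReduce pattern: a map step emits key/value
# pairs, a shuffle groups them by key, and a reduce step aggregates.
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Sort (the "shuffle"), group by key, and sum the counts,
    # as a Hadoop reducer would.
    shuffled = sorted(pairs, key=itemgetter(0))
    return {key: sum(count for _, count in group)
            for key, group in groupby(shuffled, key=itemgetter(0))}

counts = reduce_phase(map_phase(["big data big pipelines", "data in hdfs"]))
# counts == {"big": 2, "data": 2, "hdfs": 1, "in": 1, "pipelines": 1}
```

In production, the same map and reduce logic would be distributed across a cluster so the record volumes described above can be processed in parallel.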

Skills Requirements:

FOUNDATION FOR SUCCESS (Basic Qualifications)

  • Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field.
  • A Master's or Doctorate degree may substitute for required experience.
  • 7+ years of experience with Hadoop, Big Data, Kafka, and data analysis.
  • Must be able to obtain and maintain a Public Trust clearance (contract requirement).

Company Information