Required Skills

UNIX, Linux, ETL, Scala, Spark SQL, Kafka, HDFS, Big Data, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Nov 2020

Job Detail

  • Strong object-oriented programming skills; deep expertise and hands-on programming experience in Python and Big Data technologies
  • Good understanding of Hadoop and Big Data concepts is a must; automation tool development for building interfaces with Big Data batch and streaming tools
  • Should have experience developing interfaces with Big Data batch and streaming tools within the Hadoop ecosystem, such as HDFS, Hive, Impala, Pig, and Spark
  • Good experience with PySpark and open-source technologies such as Kafka, Storm, Flume, and HDFS
  • Must develop Spark programs using Spark Core and Spark SQL jobs as per requirements
  • Work independently and develop automation tool solutions with minimal guidance
  • Possess sufficient knowledge and skills to effectively handle issues and challenges within the field of specialization and to develop simple applications and solutions
  • Strong analytical and problem-solving skills; UNIX/Linux scripting to perform ETL on the Hadoop platform
  • Work with other team members to accomplish key development tasks
  • Scala knowledge is good to have

Please share matched resumes for this role.

Sikander Singh Charak
Sr. US IT Recruiter
Peritus Inc.
222 West Las Colinas Blvd, Suite# 745 East, Irving, TX 75039
Email ID: sikander.c@peritussoft.com

Company Information