Required Skills

Python, Big Data, Hadoop, Big Data Automation, Big Data Batch, Hadoop Ecosystem, Spark SQL, Kafka, HDFS, UNIX, Linux, ETL, Scala, Teradata

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Nov 2020

JOB DETAIL

  • Strong object-oriented programming skills; deep expertise and hands-on programming experience in Python and Big Data technologies
  • Good understanding of Hadoop and Big Data concepts is a must; automation tool development for building interfaces with Big Data batch and streaming tools
  • Should have experience developing interfaces with Big Data batch and streaming tools within the Hadoop Ecosystem, such as HDFS, Hive, Impala, Pig, and Spark
  • Good experience with PySpark and open-source technologies such as Kafka, Storm, Flume, and HDFS
  • Must develop Spark programs using Spark Core and Spark SQL jobs as per requirements
  • Work independently and develop automation tool solutions with minimal guidance
  • Possess sufficient knowledge and skills to effectively handle issues and challenges within the field of specialization and to develop simple application solutions
  • Strong analytical and problem-solving skills; UNIX/Linux scripting to perform ETL on the Hadoop platform
  • Work with other team members to accomplish key development tasks
  • Scala knowledge is good to have
  • Teradata knowledge or background would be a plus

Warm Regards,
Abhishek Singh
Sr. Technical Recruiter
Peritus Inc.
Contact: +1 (972) 214-2378
Email ID: abhishek.s@peritussoft.com
LinkedIn: https://www.linkedin.com/in/abhishek-singh-22558ba6/