Required Skills

Python, Big Data, Hadoop, HDFS, Hive, Pig, Spark, PySpark, Spark SQL, UNIX, ETL

Work Authorization

  • US Citizen

  • Green Card

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Nov 2020

JOB DETAIL

  • Strong object-oriented programming skills; deep expertise and hands-on programming experience in Python and Big Data technologies.
  • Good understanding of Hadoop and Big Data concepts is a must.
  • Automation tool development for building interfaces with Big Data batch and streaming tools.
  • Should have experience developing interfaces with Big Data batch and streaming tools within the Hadoop ecosystem, such as HDFS, Hive, Impala, Pig, and Spark.
  • Good experience with PySpark and open-source technologies such as Kafka, Storm, Flume, and HDFS.
  • Must develop Spark programs using Spark Core and Spark SQL jobs as per requirements (see the illustrative sketch after this list).
  • Work independently and develop automation tool solutions with minimal guidance.
  • Possess sufficient knowledge and skills to effectively handle issues and challenges within the field of specialization and to develop simple application solutions.
  • Strong analytical and problem-solving skills; UNIX/Linux scripting to perform ETL on the Hadoop platform.
  • Work with other team members to accomplish key development tasks.
  • Good to have: Scala knowledge.
  • Teradata knowledge or background would be a plus.
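For illustration only, a minimal PySpark sketch of the kind of Spark Core / Spark SQL ETL job described above; the paths, column names, and application name are hypothetical and not taken from this posting:

# Illustrative PySpark sketch: read raw records from HDFS, clean them with
# Spark SQL, and write the result back as Parquet for downstream querying.
# All paths, columns, and names below are hypothetical examples.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("sample-etl-job")   # hypothetical application name
    .getOrCreate()
)

# Read a batch of raw CSV records from HDFS (hypothetical path and schema).
raw = spark.read.csv("hdfs:///data/raw/transactions.csv",
                     header=True, inferSchema=True)

# Register a temporary view so the transformation can be expressed in Spark SQL.
raw.createOrReplaceTempView("transactions")

cleaned = spark.sql("""
    SELECT customer_id,
           CAST(amount AS DOUBLE)           AS amount,
           TO_DATE(txn_date, 'yyyy-MM-dd')  AS txn_date
    FROM transactions
    WHERE amount IS NOT NULL
""")

# Write the cleaned data back to HDFS, partitioned by date.
cleaned.write.mode("overwrite").partitionBy("txn_date") \
    .parquet("hdfs:///data/clean/transactions")

spark.stop()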

 

Regards

 

Vijayakumar R

Technical Recruiter

Smart IT Frame

Direct No.: +1 732-992-3135, Ext. 176

Email: vijayr@smartitframe.com

LinkedIn: linkedin.com/in/vijay-k-0a0aa9165

Company Information