Required Skills

HDFS, MapReduce, Impala, Hive, Hadoop, Python, Big Data Domain

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-to-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of Positions: 1

  • Posted: 21st Nov 2020

Job Details

  • Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities

  • Implementing ETL processes using Spark

  • Ensuring software is optimized through design reviews and code reviews

  • Monitoring performance, advising on any necessary infrastructure changes, and defining data retention policies

  • Managing the Hadoop cluster and all of its included services

  • Building stream-processing systems using solutions such as Storm or Spark Streaming

  • Resolving any ongoing issues with operating the Hadoop cluster

Qualifications:

  • Proficient understanding of distributed computing principles

  • Proficiency with Hadoop, MapReduce, and HDFS

  • Good knowledge of Big Data querying tools such as Pig, Hive, and Impala

  • Strong understanding of data structures and algorithms

  • Experience integrating data from multiple data sources

  • Experience with messaging systems such as Kafka or RabbitMQ

  • Experience with scripting languages such as Python

  • Experience with Cloudera, MapR, or Hortonworks Hadoop distributions

  • Experience in the field of Automotive Telematics software is a big plus

  • Experience with scripting tools and methods to optimize software development and testing activities

  • 4-7+ years of industry experience in the Big Data domain

 

Company Information