Required Skills

JVM, Java, Spark/PySpark, Apache NiFi, HBase, Kafka, ZooKeeper

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 21st Jan 2021

Job Detail

Responsibilities
• Write software to interact with HDFS and MapReduce (see the sketch after this list).
• Assess requirements and evaluate existing solutions.
• Build, operate, monitor, and troubleshoot Hadoop infrastructure.
• Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
• Develop documentation and playbooks to operate Hadoop infrastructure.
• Understand and implement Hadoop's security mechanisms.
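
To illustrate the kind of work described above, here is a minimal sketch of a Spark job that reads from and writes to HDFS. It assumes PySpark is installed and an HDFS NameNode is reachable at the hypothetical address namenode:8020; the paths and column names are illustrative only.

    from pyspark.sql import SparkSession

    # Start (or reuse) a Spark session; the application name is arbitrary.
    spark = SparkSession.builder.appName("hdfs-example").getOrCreate()

    # Read a CSV dataset from HDFS into a DataFrame.
    df = spark.read.csv(
        "hdfs://namenode:8020/data/events.csv",  # hypothetical HDFS path
        header=True,
        inferSchema=True,
    )

    # Simple aggregation: count records per event type.
    counts = df.groupBy("event_type").count()
    counts.show()

    # Write the result back to HDFS as Parquet.
    counts.write.mode("overwrite").parquet("hdfs://namenode:8020/output/event_counts")

    spark.stop()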
Skills
• Knowledge of the JVM runtime, the Java language, and ideally another JVM-based programming language.

• Demonstrated proficiency in Apache NiFi and Spark/PySpark.

• Proficient in software engineering principles that produce maintainable software, and able to apply them in practice.
• Have worked with a Hadoop distribution.
• Familiar with HBase, Kafka, ZooKeeper, or other Apache software (see the sketch after this list).

• Knowledge of Linux operation, networking, and security.

• Familiar with Cloudera 6.3.2.
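
As an illustration of the Spark and Kafka skills listed above, the following is a minimal Structured Streaming sketch. It assumes the spark-sql-kafka connector is on the classpath and a Kafka broker is reachable at the hypothetical address broker:9092; the topic name is illustrative only.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

    # Subscribe to a Kafka topic as a streaming DataFrame.
    stream = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "events")                      # hypothetical topic
        .load()
    )

    # Kafka delivers keys and values as bytes; cast the value to a string.
    messages = stream.select(col("value").cast("string").alias("message"))

    # Print incoming messages to the console for demonstration purposes.
    query = messages.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()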

Company Information