Required Skills

Hadoop platform architecture, Hardware specs, Performance Benchmarking, Networking architecture, Datacenter, Git, Jira, shell scripts, Hadoop Ecosystem, HDFS

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Nov 2020

JOB DETAIL

  • Minimum of 9 years of work experience in Linux and 3 years of work experience in Big Data technologies.
  • Good knowledge of Hadoop platform architecture, hardware specs, and performance benchmarking.
  • Requires good Unix skills and good knowledge of networking architecture, datacenter issues, etc.
  • Experience in tool integration, automation, and configuration management on Git and Jira platforms.
  • Should be proficient in writing shell scripts, automating them, and designing scheduler processes (a minimal sketch follows this list).
  • Should be able to read Python/Scala/Java programs and debug issues.
  • Requires a good understanding of the Hadoop ecosystem, HDFS, and Big Data concepts.
  • Good experience in developing Ansible playbooks, imaging, etc.
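As an informal illustration of the shell-scripting and scheduler work called out above (not part of the client's stated requirements), below is a minimal sketch of an HDFS health check that could be run from a scheduler. The script name, log path, and report parsing are assumptions for illustration; the exact output of hdfs dfsadmin -report varies by Hadoop version.

    #!/usr/bin/env bash
    # hdfs_health_check.sh -- hypothetical example; paths and parsing are illustrative only.
    set -euo pipefail

    LOG_FILE="/var/log/hdfs_health.log"            # assumed log location
    REPORT=$(hdfs dfsadmin -report 2>/dev/null)    # cluster-wide DataNode report

    # Pull the dead-DataNode count from a line such as "Dead datanodes (2):".
    # The report layout differs across Hadoop releases, so adjust the pattern as needed.
    DEAD_NODES=$(printf '%s\n' "$REPORT" | grep -i -m1 'dead datanodes' \
        | grep -o '[0-9][0-9]*' | head -n1 || true)
    DEAD_NODES=${DEAD_NODES:-0}

    echo "$(date -u '+%Y-%m-%dT%H:%M:%SZ') dead_datanodes=${DEAD_NODES}" >> "$LOG_FILE"

    # A non-zero exit lets whatever scheduler runs this (cron, Oozie, etc.) flag the failure.
    if [ "$DEAD_NODES" -gt 0 ]; then
        echo "WARNING: ${DEAD_NODES} dead DataNode(s) reported" >&2
        exit 1
    fi

Such a check might be scheduled with a crontab entry like */15 * * * * /opt/scripts/hdfs_health_check.sh (path assumed), relying on the exit code to surface failures.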

Company Information