Required Skills

Hadoop Administrator

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 16th Sep 2021

JOB DETAIL

In this job, you'll design, build, scale, and maintain infrastructure in both the datacenter and AWS to support Big Data applications.

You will design, build, and own the end-to-end availability of the Big Data platform in both AWS and the datacenter.

You will improve the efficiency, reliability, and security of our Big Data infrastructure, while making sure that our developers & analysts have a smooth experience with it.

You will work on automation to build and maintain the new platform on AWS.

You will build custom tools to automate day-to-day operational tasks.

You will be responsible for setting the standards for our production environment.

You will take part in a 24x7 on-call rotation with the rest of the team and respond to pages and alerts to investigate issues in our platform.

  • Strong experience with the Hadoop ecosystem, including HDFS, YARN, Hive, Spark, Oozie, Presto, and Ranger

  • Strong experience with AWS, including EMR and RDS, and a good understanding of IaaS and PaaS

  • Strong foundation in Hadoop security, including SSL/TLS, Kerberos, and role-based authorization

  • Performance tuning experience with Hadoop clusters, ecosystem components, and MapReduce/Spark jobs

  • Experience with infrastructure automation using Terraform, CI/CD pipelines (Git, Jenkins, etc.), and configuration management tools like Ansible

  • Able to leverage technologies like Kubernetes, Docker, and ELK to help our Data Engineers/Developers scale their efforts in creating new and innovative products

  • Experience designing and implementing log-based monitoring solutions using CloudWatch, CloudTrail, Lambda, etc.

  • Ability to conduct a post-mortem when something goes wrong with your systems, identify what went wrong, and provide a detailed RCA

  • Proficiency in Bash and Python or Java

  • Good understanding of all aspects of the JRE/JVM and GC tuning

  • Hands-on experience with RDBMS (Oracle, MySQL) and basic SQL

  • Experience with Kafka in a large environment is a plus

  • Hands-on experience with Snowflake is a plus

  • Hands-on experience with Qubole and Airflow is a plus

 

Company Information