Required Skills

Big Data

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 13th Mar 2023

Job Detail

  • Overall experience must be 8 years
  • Experience in setting up production Hadoop clusters with optimum configurations
  • Drive automation of Hadoop deployments, cluster expansion and maintenance operations
  • Manage Hadoop clusters, including monitoring, alerts, and notifications
  • Job scheduling, monitoring, debugging and troubleshooting (see the Airflow sketch after this list)
  • Monitoring and management of the cluster in all respects, notably availability, performance and security
  • Data transfer between Hadoop and other data stores, including relational databases (see the Spark JDBC sketch after this list)
  • Set up High Availability/Disaster Recovery environments
  • Debug/Troubleshoot environment failures/downtime
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines

Skills, Experience and Requirements

  • Experience with Kafka, Spark, etc.
  • Experience working with AWS big data technologies (EMR, Redshift, S3, Glue, Kinesis, DynamoDB, and Lambda)
  • Good knowledge of creating volumes, security group rules, key pairs, floating IPs, images, and snapshots, and of deploying instances on AWS (see the boto3 sketch after this list)
  • Experience configuring and/or integrating with monitoring and logging solutions such as syslog and ELK (Elasticsearch, Logstash, and Kibana)
  • Strong UNIX/Linux systems administration skills, including configuration, troubleshooting and automation
  • Knowledge of Airflow, NiFi, StreamSets, etc.
  • Knowledge of container virtualization
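
For illustration, a minimal Airflow sketch of the kind of scheduled maintenance automation this role implies; the DAG id, task names, and balancer threshold are assumptions, not taken from the posting:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical nightly maintenance DAG; names and schedule are assumptions.
    default_args = {
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
    }

    with DAG(
        dag_id="hadoop_cluster_maintenance",
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Rebalance HDFS blocks across DataNodes; the 10% threshold is illustrative.
        rebalance = BashOperator(
            task_id="hdfs_balancer",
            bash_command="hdfs balancer -threshold 10",
        )

        # Report NameNode HA state so failover issues surface in monitoring.
        check_ha = BashOperator(
            task_id="check_namenode_ha",
            bash_command="hdfs haadmin -getAllServiceState",
        )

        rebalance >> check_ha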
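
Likewise, a sketch of the Hadoop-to-relational data transfer duty using Spark's JDBC source; the connection URL, table, credentials, and output path are placeholders, and a suitable JDBC driver is assumed to be on the classpath:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rdbms_to_hdfs").getOrCreate()

    # Pull a relational table into the cluster; connection details are placeholders.
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://db-host:5432/sales")
        .option("dbtable", "orders")
        .option("user", "etl_user")
        .option("password", "change-me")
        .load()
    )

    # Land it on HDFS as Parquet for downstream Hadoop jobs.
    orders.write.mode("overwrite").parquet("hdfs:///warehouse/orders")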
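
And a hedged boto3 sketch of the AWS provisioning tasks (security group rules, key pairs, volumes); the region, resource names, CIDR range, and volume size are all assumptions:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

    # Security group with an SSH ingress rule; name and CIDR are hypothetical.
    sg = ec2.create_security_group(
        GroupName="hadoop-cluster-sg",
        Description="Hadoop cluster access",
    )
    ec2.authorize_security_group_ingress(
        GroupId=sg["GroupId"],
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 22,
            "ToPort": 22,
            "IpRanges": [{"CidrIp": "10.0.0.0/16"}],
        }],
    )

    # Key pair for instance access and an EBS volume for HDFS storage.
    ec2.create_key_pair(KeyName="hadoop-admin-key")
    ec2.create_volume(AvailabilityZone="us-east-1a", Size=500, VolumeType="gp3")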


Company Information