Required Skills

Scala, Spark, Hadoop, Kafka

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 17th Oct 2023

Job Detail

  • Monitor production jobs for the P1 system (Data Management and Reporting – DMR).
  • Debug and restart jobs in the event of production failures.
  • Acknowledge, respond to, and resolve production issues.
  • Understand change requests / enhancement requests.
  • Design and develop data-loading strategies.
  • Build, develop, and test shared components that will be used across modules.
  • Extract source data files, then stage, transform, and load them into the EERA system (PostgreSQL).
  • Ingest data from the mainframe system using Kafka and load it into EERA.
  • Extract source data files, then stage, transform, and load them into the BDS system (Hive-based).
  • Use Apache Spark with the functional programming language Scala for large-scale data processing.
  • Create programs that continuously listen for requests from the consumption layer, generate data extracts, and publish them via email.
  • Provide on-call support during weekends.
  • Coordinate with the offshore team on a daily / as-needed basis.
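The stage-transform-load work described above can be sketched in plain Scala (no Spark dependency, so it stays self-contained; the record shape and aggregation rule are hypothetical stand-ins, since the posting does not specify the DMR/EERA schemas):

```scala
// Hypothetical record type for illustration only.
case class Trade(id: Int, symbol: String, amount: Double)

object EtlSketch {
  // Stage: parse raw delimited lines, silently dropping malformed rows.
  def stage(lines: Seq[String]): Seq[Trade] =
    lines.flatMap { line =>
      line.split(",") match {
        case Array(id, sym, amt) =>
          scala.util.Try(Trade(id.trim.toInt, sym.trim, amt.trim.toDouble)).toOption
        case _ => None
      }
    }

  // Transform: aggregate amounts per symbol (a stand-in for real business rules).
  def transform(rows: Seq[Trade]): Map[String, Double] =
    rows.groupBy(_.symbol).view.mapValues(_.map(_.amount).sum).toMap

  def main(args: Array[String]): Unit = {
    val raw = Seq("1,ABC,10.5", "2,XYZ,3.0", "bad row", "3,ABC,4.5")
    println(transform(stage(raw)))
  }
}
```

In the actual role this logic would run as Spark transformations over DataFrames or RDDs, with the staged output written to PostgreSQL or Hive rather than returned in memory.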

 

Mandatory Skills:

 

  • Scala, Spark, Hadoop, Kafka
  • PostgreSQL
  • Hive
  • Airflow (or Control M / Oozie)
  • Kubernetes (or equivalent containerization tool)
  • Agile
  • Strong architecture, design & coding skills
  • Experience in production operations and support
  • Experience in distributed databases

 

Company Information