Required Skills

Cassandra, Hadoop, MongoDB, Apache Spark, HDFS, YARN, MapReduce, Pig & Hive, Flume & Sqoop

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :- Not Required

  • PG :- Not Required

Other Information

  • No. of positions :- 1

  • Posted :- 23rd Jan 2024

JOB DETAIL

•Must have 6+ years of recent experience working for a major bank or brokerage house in the US.

•Must have 12+ years of experience maintaining applications utilizing Java, J2EE, SDLC, and WebSphere.

•Must have spent the last 6 years working with Cassandra, Hadoop, MongoDB, Apache Spark, HDFS, YARN, MapReduce, Pig & Hive, Flume & Sqoop, and ZooKeeper.

•Must have 6 years of experience maintaining Tier-1 data-driven applications.

•Must have experience supporting 24/7 uptime under strict SLAs.

•Extensive experience maintaining data pipelines and aggregating & transforming raw data from a variety of data sources.

•Extensive experience optimizing data delivery and redesigning systems to improve performance, as well as handling, transforming, and managing Big Data using Big Data frameworks.

•Extensive experience processing data in parallel on top of distributed Hadoop storage using MapReduce.

•Must have experience with SOA design principles.

•Must have 5+ years of programming experience in Scala, Java, Python, or Go.

•Must have 5+ years developing on Hadoop/Spark.

•Must have 6+ years developing on an RDBMS such as Microsoft SQL Server or PostgreSQL.

•Must have experience with large data sets, regularly transforming and querying tables of more than 20 million records.

•Exposure to data hygiene routines and models.

•Experience in database maintenance.

•Ability to identify problems and effectively communicate solutions to the team.

Company Information