Required Skills

Hadoop, Hive, Spark, MapReduce; REST APIs; CI/CD

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Jan 2021

JOB DETAIL

Job Title: Data Engineer
Location: Raleigh, NC
Duration: 2+ Years

Skills/qualifications:
•    At least 8+ years of experience in software engineering, developing data pipelines, APIs, and integration technologies.
•    Experience with one or more of the following programming/scripting languages – Python, JVM-based languages, or JavaScript – and the ability to pick up new languages.
•    Experience with streaming data technologies, such as Kafka.
•    Experience with Big Data technologies – Hadoop, Hive, Spark, MapReduce.
•    Experience with Change Data Capture (CDC) and Extract, Transform and Load (ETL) tools.
•    Experience with building Data Pipelines that utilize Kafka or another pub/sub tool.
•    Experience with Cloud implementation, preferably AWS.
•    Experience with Master data Management (MDM) tools is a plus.
•    Experience with Snowflake and Databricks is a plus.
•    Strong expertise with Git.
•    Experience creating REST APIs.
•    Experience in building CI/CD pipelines.
•    Proficient in self-testing of applications, including unit testing, use of mock frameworks, and test-driven development (TDD).
•    Experience building a software ‘product’ that will be transitioned to others for support and further evolution (including documentation, complete tooling/instrumentation, etc.).

Company Information