Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 9th Jun 2022




Architect, design, build, test, and deploy cutting-edge solutions at scale that impact millions of customers worldwide and drive value from data.

Interact with engineering teams across geographies to leverage expertise and contribute to the tech community.

Engage with Product Management and Business to drive the agenda, set your priorities, and deliver product features that keep the platform ahead of the market.

Identify the right open-source tools to deliver product features by performing research, running POCs/pilots, and/or engaging with open-source communities.

Develop and/or contribute features that enable customer analytics.

Deploy and monitor products on cloud platforms.

Develop and implement best-in-class monitoring processes to ensure data applications meet SLAs.


Our Ideal Candidate:


You have a deep interest in and passion for technology. You love writing and owning code, and you enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical, and decision-making skills, along with excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.



Your Qualifications:


Minimum 5–10 years of Big Data development experience

Demonstrated, up-to-date expertise in data engineering and complex data pipeline development

Design, develop, implement, and tune distributed data processing pipelines that handle large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built

Exposure to data governance (data quality, metadata management, security, etc.)

Mandatory experience with advanced Python (OOP) for writing data pipelines and data processing layers

Mandatory experience with GCP (Dataproc, BigQuery) and Airflow

Strong experience with Spark (Scala)

Demonstrated expertise in writing complex, highly optimized queries across large data sets

Proven, working expertise with Big Data technologies: Hadoop, HDFS, Hive, Spark (Scala/PySpark), and SQL

Retail experience is a huge plus


Company Information