Required Skills

Data Fabric - Big Data Processing - Apache Spark

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :- Not Required

  • PG :- Not Required

Other Information

  • No. of positions :- 1

  • Posted :- 1st Dec 2025

JOB DETAIL

  • 10 to 15 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.

  • Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.

  • Proficiency in building end-to-end data platforms and data services in GCP is a must.

  • Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.

  • Experience with microservices architectures (Kubernetes, Docker). Our microservices are built on the TypeScript, NestJS, and Node.js stack; candidates with this experience are preferred.

  • Experience building semantic layers.

  • This is a remote position; engineers will work US Eastern business hours.

 

Company Information