Required Skills

BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, PubSub

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • Number of positions: 1

  • Posted: 7th Jun 2024

JOB DETAIL

  • As a senior/principal engineer, you will be responsible for ideation, architecture, design and development of a new enterprise data platform. You will collaborate with other cloud and security architects to ensure seamless alignment within our overarching technology strategy.
  • Architect and design core components with a microservices architecture, abstracting platform and infrastructure intricacies.
  • Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
  • Design and develop connector frameworks and modern connectors to source data from disparate systems both on-prem and cloud.
  • Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
  • Architect and design the best security patterns and practices.
  • Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
  • Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
  • Design and develop advanced analytics and machine learning capabilities on the data platform.
  • Design and develop observability and data governance frameworks and practices.
  • Stay up to date with the latest data engineering trends, technologies, and best practices.
  • Drive the deployment and release cycles, ensuring a robust and scalable platform.


Requirements:

  • 10+ years (for senior) or 15+ years (for principal) of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
  • Prior experience architecting and building successful enterprise-scale data platforms in a greenfield environment is a must.
  • Proficiency in building end-to-end data platforms and data services in GCP is a must.
  • Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, PubSub.
  • Experience with microservices architectures: Kubernetes, Docker, and Cloud Run.
  • Experience building semantic layers.
  • Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
  • Solid experience with architecting and implementing metadata management including data catalogues, data lineage, data quality and data observability for big data workflows.
  • Hands-on experience with GCP ecosystem and data lakehouse architectures.
  • Strong understanding of data modeling, data architecture, and data governance principles.
  • Excellent experience with DataOps principles and test automation.
  • Excellent experience with observability tooling: Grafana, Datadog.

Company Information