Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 17th Dec 2025

JOB DETAIL

  • Solid experience with, and understanding of, the considerations for large-scale design and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
  • Monitor the data lake and warehouse to ensure that the appropriate support teams are engaged at the right times.
  • Design, build, and test scalable data ingestion pipelines, and perform end-to-end automation of the ETL process for the various datasets being ingested.
  • Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.
  • Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (e.g., Kafka, Splunk).
  • Work with business and cross-functional teams to gather and document requirements to meet business needs.
  • Provide support as required to ensure the availability and performance of ETL/ELT jobs.
  • Provide technical assistance and cross training to business and internal team members.
  • Collaborate with business partners for continuous improvement opportunities.

Requirements

JOB SPECIFICATIONS:

Education: Bachelor's Degree in Computer Science, Information Technology, Engineering, or related field

Experience, Skills & Qualifications:

  • 6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
  • 4+ years of experience with one of the leading public clouds.
  • 4+ years of experience designing and building scalable data pipelines covering extraction, transformation, and loading.
  • 4+ years of experience with Python and Scala, with working knowledge of notebooks.
  • 2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
  • At least 2 years of experience in data governance and metadata management.
  • Ability to work independently, solve problems, and keep stakeholders updated.
  • Ability to analyze, design, develop, and deploy solutions per business requirements.
  • Strong understanding of relational and dimensional data modeling.
  • Experience in DevOps and CI/CD related technologies.
  • Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.

Company Information