Required Skills

Python, ANSI SQL, PL/SQL

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 16th Dec 2023

JOB DETAIL

We are currently seeking an experienced Data Engineer to join the Big Data and Advanced Analytics department. The Data Engineer will work closely with business domain experts to create an Enterprise Data Lakehouse to support data analytic use cases for midstream oil and gas business units. This individual will provide analytical and technical leadership to the team to advance the data engineering practice within the organization.

Responsibilities:

  • Design and implement reliable data pipelines to integrate disparate data sources into a single Data Lakehouse
  • Design and implement data quality pipelines to ensure data correctness and build trusted datasets
  • Design and implement a Data Lakehouse solution that accurately reflects business operations
  • Develop and maintain reusable data building blocks
  • Assist with data platform performance tuning and physical data model support including partitioning and compaction
  • Provide guidance in data visualizations and reporting efforts to ensure solutions are aligned to business objectives
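As context for the data quality responsibility above, the sketch below shows the kind of basic correctness check such a pipeline might run before publishing a trusted dataset. This is a minimal illustration only; the record fields (`well_id`, `flow_rate`) and validation rules are hypothetical, not part of the role description.

```python
# Hypothetical data-quality gate: split incoming records into valid and
# rejected sets based on required fields and numeric range rules.

def validate_records(records, required_fields, numeric_bounds):
    """Return (valid, rejected) lists after basic correctness checks."""
    valid, rejected = [], []
    for rec in records:
        # Every required field must be present and non-null.
        ok = all(rec.get(f) is not None for f in required_fields)
        # Every bounded numeric field must fall within its allowed range.
        for field, (lo, hi) in numeric_bounds.items():
            value = rec.get(field)
            ok = ok and value is not None and lo <= value <= hi
        (valid if ok else rejected).append(rec)
    return valid, rejected

# Example midstream-style sensor readings (illustrative values only).
readings = [
    {"well_id": "W-101", "flow_rate": 520.0},
    {"well_id": "W-102", "flow_rate": -5.0},   # out of range
    {"well_id": None,    "flow_rate": 480.0},  # missing key field
]
good, bad = validate_records(readings, ["well_id"], {"flow_rate": (0.0, 10000.0)})
```

In a production pipeline the rejected records would typically be routed to a quarantine table for review rather than silently dropped.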

Qualifications:

  • 7+ years of experience as a Data Engineer designing and maintaining data pipeline architectures
  • 5+ years of programming experience in Python, ANSI SQL, PL/SQL, and T-SQL
  • 3+ years implementation experience on modern data stack technologies including cloud data warehouses, data lakes, and data lakehouses
  • Experience with various data warehouse design methodologies including dimensional modelling, non-volatile, time-variant, function-centric, and unification-centric design
  • Experience in cleansing, curating, and organizing complex datasets and data sources
  • Experience in software development practices such as Design Principles and Patterns, Testing, Refactoring, CI/CD, and version control
  • Knowledge of modern data platform technologies including Apache Airflow, Kubernetes, and S3 Object Storage
  • Experience with Dremio, DBT, and Airbyte is preferred

Company Information