Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 22nd Nov 2023

JOB DETAIL

The Senior Data Engineer will play a critical role on the Data and Analytics team, which is responsible for transforming data from disparate systems to provide insights and analytics for business stakeholders. This role will leverage cloud-based infrastructure to implement technology solutions that are scalable, resilient, and efficient, and will collaborate with Data Engineers, Data Analysts, Data Scientists, DBAs, cross-functional teams, and business partners.

Job Description

  • Architect, design, implement and operate data engineering solutions, using Agile methodology, that empower users to make informed business decisions.
  • Continuously improve quality, efficiency, and scalability of data pipelines.
  • Write test cases to ensure data quality, reliability, and a high level of confidence.
  • Mentor junior team members through code reviews and recommend adherence to best practices.
  • Advance modern technologies to improve data quality and reliability.
  • Work within the full data lifecycle, ensuring high-quality data flows across applications, machine learning, business analytics, and reporting.
  • Exhibit solid critical thinking skills, the ability to synthesize complex problems, and a talent for transforming data into solutions that add value across a myriad of business requirements.
  • Demonstrate the ability to facilitate and take ownership of assigned technical projects in a fast-paced environment.

EXPERIENCE REQUIRED

  • 7+ years of professional experience, including 4+ years of development experience building and maintaining ETL (Extract, Transform, and Load) pipelines.
  • 3+ years of Python development experience and 1+ year of Spark development experience.
  • Experience with AWS (Amazon Web Services) integrations such as Kinesis, Firehose, Aurora Unload, Redshift, Spectrum, Elastic MapReduce, SageMaker and Lambda.
  • Expert skills in writing SQL queries, including performance tuning and the use of indexes and materialized views to improve query performance.
  • Advanced knowledge of both OLTP and OLAP environments with successful implementation of efficient design concepts.
  • Proficiency with the design and implementation of NoSQL databases to optimize Big Data storage and retrieval.
  • Experience with API integrations with external vendors such as Salesforce and Google Analytics to push/pull data between organizations.
  • Familiarity with data orchestration pipelines using Argo or Airflow.
  • Experience with real-time technologies such as Kafka, Flink, and Spark Structured Streaming is a plus.
  • Knowledge of analytic tools such as R, Tableau, and Python Pandas is a plus.
  • Financial services industry experience is a plus.
  • Bachelor of Science degree in Computer Science or equivalent.

Company Information