Required Skills

Data Engineer, ETL, AWS, Oracle, Redshift

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 6th Jan 2024

JOB DETAIL

  • Works with business stakeholders and IT to translate business logic into scalable data and analytic solutions
  • The Financial Services Data Engineers work as a centralized but dedicated team. The lead data engineer will primarily support a specific Financial Services team directly, but will also support other teams as needed
  • Works with business stakeholders to ensure financial data is decision-ready and reliably meets quality standards defined by stakeholders throughout the data lifecycle
  • Develops ETL pipelines, typically leveraging existing patterns as available for specific project execution
  • Identifies scalable approaches for resolving data quality or consistency issues
  • Works with various Financial Services stakeholders to understand, document, and scope process improvements needed for new and existing financial data pipelines
  • Maintains a solid understanding of the Client suite of data engineering tools (dbt, AWS, Apache Airflow, Databricks, Alteryx, APIs)
  • Leads implementation of new capabilities, processes, and technical patterns within Financial Services
  • Leads and is responsible for multiple data projects across Financial Services
  • Works with IT and Enterprise Analytics to develop data models to enable data science and analytical requests from Financial Services staff
  • Owns and is accountable for data model and code quality as well as relevant documentation
  • Learns and maintains business context of the Financial Services team they are supporting through data

Minimum Qualifications

  • Strong analytical and data modeling skills
  • Solid understanding of database technology
  • Strong skills in SQL and Python
  • Experience using GitHub or similar code repository
  • Fast learner and proven problem solver
  • Excellent oral and written communication skills
  • Business-results and customer-focused orientation. Seeks to understand business needs and works to anticipate, identify, and meet end-user needs

Preferred Qualifications (of great importance to the client for the Data Architecture)

  • Experience using the AWS big data technology stack, e.g., AWS S3, Redshift
  • Experience in Apache Airflow
  • Experience using data visualization tools
  • Experience implementing data governance principles
  • Experience with Oracle (ERP) Financial modules
  • Experience with dbt
  • Experience with Databricks

Company Information