Required Skills

Data Warehouse, ETL, Linux, Python, DataFrames, Oracle/SQL

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Contract

  • 1099-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG:

  • PG:

Other Information

  • No. of positions: 1

  • Posted: 28th Dec 2020

JOB DETAIL

Position: Data Architect

Location: Charlotte, NC

Duration: Long Term Contract

 

This is an exciting opportunity to be at the forefront of developing new data engineering methods of transporting, manipulating and conforming data for business consumption.

Corporate Investments Data Warehouse (CIDW) and Transaction Hub (THUB) are data warehouses supported by the Data Horizontal team within Risk & Finance Technology. CIDW is both a general purpose LOB data store and calculation engine supporting the needs of Corporate Treasury (Corporate Investments, Global Funding, Finance, Market Risk, etc.) as well as the Enterprise Authorized Data Source (ADS) for Cash & Cash Equivalents, Intercompany Loans and Long Term Debt. THUB is a data store of Intrader (3rd party hosted SOR) position & transactional data.

 

Agile teams consist of a scrum master, developers, data quality analysts and data / business analysts who support front office, middle office, market risk and finance users in collecting, transforming, loading and reporting end of day and intraday fixed income and derivative trading positions and other financial data.

 

The role is for a Data Engineer / Architect. In this role, you will help the team craft data solutions that meet business and enterprise requirements. While our core stack is currently Informatica / Oracle / SQL, we are exploring new methods of moving data. Candidates experienced with Big Data technologies such as Hadoop, Kafka, Spark, Hive, and NiFi, and with Python and/or Java, are strongly encouraged to apply. This person is expected to bring new technology, knowledge, and experience to the team and to mentor other team members.

 

Required Skills:

  • 10+ years of development experience in Oracle, SQL Server, Netezza, or another industry-accepted database platform.
  • 7+ years in Data Warehouse / Data Mart / Business Intelligence delivery.
  • 3+ years with an industry ETL tool (preferably Informatica PowerCenter).
  • 3+ years of Linux / shell scripting (e.g. Bash, Perl, Python).
  • 5+ years of Python (e.g. Pandas, DataFrames) and its use in data processing solutions.
  • Experience with an enterprise job scheduling tool (e.g. Autosys, Airflow).
  • Proven experience designing and building integrations that support standard data warehousing data models: star schema, snowflake, and various normal forms.
  • Strong analytical and problem-solving skills. Passion for working with data.
  • Experience with two or more of the following:
    • Java and its use in implementing web services and data processing solutions.
    • Developing data pipeline solutions with Python-based methodologies rather than industry-standard ETL tools.
    • Modern job orchestration tools for data pipelines, such as Airflow.
    • Integrating rules engines (e.g. Sapiens) into data pipeline workflows.
    • Big Data and/or emerging data technology tools and methodologies.
    • Kafka, Sqoop, Spark, NiFi.
    • Data wrangling tools such as Alteryx and/or Trifacta.
    • Data visualization tools such as Tableau and/or MicroStrategy.
  • Bachelor’s degree in a STEM-related field.
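As a rough illustration of the Pandas / DataFrame data-processing skill listed above, the sketch below (using entirely hypothetical desk and position data, not anything from this posting) shows the kind of conform-and-aggregate step a warehouse pipeline might perform before loading a mart:

```python
import pandas as pd

# Hypothetical end-of-day position data; column names are illustrative only.
positions = pd.DataFrame({
    "desk": ["Treasury", "Treasury", "Funding"],
    "instrument": ["T-Bill", "IG Bond", "CP"],
    "notional": [1_000_000.0, 250_000.0, 500_000.0],
})

# Conform and aggregate: total notional per desk, as an ETL step might
# before loading an end-of-day summary table.
eod = positions.groupby("desk", as_index=False)["notional"].sum()
print(eod)
```

In practice such logic would sit inside a scheduled pipeline (e.g. an Airflow task) rather than a standalone script.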

 

Desired Skills:

  • 10+ years Data Engineering experience.
  • Banking / Capital Markets / Accounting domain knowledge.
  • Experience automating QA tests as part of the development workflow.
  • Experience creating low-level and high-level design artifacts.
  • Ability to present technical concepts to senior level business stakeholders.
  • Excellent verbal and written communication skills.
  • Self-motivated.
  • Excellent interpersonal skills, positive attitude, team-player.
  • Willingness to learn and adapt to changes.
  • Experience in working in a global technology development model.
  • Ability to effectively manage multiple deadline-driven, customer-sensitive projects and tasks.
  • Knowledge of agile methodology and frameworks like Scrum, Kanban, etc.
  • Experience working in a SAFe Agile delivery model.
  • Advanced degree.

Company Information