Required Skills

ETL, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 26th Feb 2024

Job Detail

Skills:

  • At least 5 years of full-time development experience using Python.
  • Designing and implementing highly performant data ingestion pipelines from multiple sources using Azure Databricks.
  • Direct experience building data pipelines using Azure Data Factory (and preferably Databricks).
  • Extensive experience in software development and the entire SDLC.
  • Solid understanding of a variety of programming tools and development platforms.
  • Experience creating high-level product specifications and design documents.
  • Experience writing Python applications using frameworks such as Django, Flask, Pyramid, or Tornado.
  • Experience with Python testing and code-analysis tools such as Pytest and Pylint.
  • Integration experience (ETL, ELT) with Python.
  • Strong SQL skills.
  • Familiarity with SSIS.
  • General development expertise, including use of version control, ticketing, and continuous integration systems.
  • Experience using an enterprise scheduler (Tidal).
  • Experience in an Agile development environment.
  • Strong communication skills to concisely report the status of work, issues, and next actions, and to articulate technical complexity to business analysts, project managers, and business users.
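The listing asks for experience with Pytest-style testing of Python code. As a small illustration only, a test module in that style might look like the following; the `clean_amount` helper and its values are hypothetical, not part of this role's codebase.

```python
# Hypothetical ETL helper plus a Pytest-style test for it.
def clean_amount(raw: str) -> float:
    """Parse a currency string like '$1,234.50' into a float."""
    return float(raw.replace("$", "").replace(",", ""))

def test_clean_amount():
    # Pytest discovers functions named test_* and reports failed asserts.
    assert clean_amount("$1,234.50") == 1234.50
    assert clean_amount("99") == 99.0
```

Run with `pytest <file>.py`; Pylint would additionally flag style and static-analysis issues in the same module.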


Responsibilities:

  • Migrate existing SSIS ETL scripts to Python; develop new ETL scripts.
  • Support existing SSIS SQL projects.
  • Maintain ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL.
  • Write SQL queries against Snowflake.
  • Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling.
  • Work closely with existing senior integration staff to flesh out design, priority, and build.
  • Build the scaffolding and framework needed for staging and transforming datasets.
  • Use the existing DevOps pipeline for Python and enhance it if necessary.
  • Automate Python scripts on our enterprise scheduler.
  • Apply strong troubleshooting skills to identify root causes and resolve production issues.
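The core task above, migrating SSIS-style ETL into Python against a warehouse, typically reduces to an extract-transform-load loop. The sketch below is illustrative only: it uses the standard-library `sqlite3` module as a stand-in for a Snowflake connection (in practice `snowflake-connector-python` would take its place), and the table and function names are hypothetical.

```python
import sqlite3

def extract(rows):
    """Stand-in for reading from a source system (e.g. a Snowflake query)."""
    yield from rows

def transform(row):
    """Normalize a raw (name, amount) record: trim, uppercase, cast amount."""
    name, amount = row
    return name.strip().upper(), round(float(amount), 2)

def load(conn, rows):
    """Bulk-insert transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

source = [(" widget ", "19.999"), ("gadget", "5")]
conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
load(conn, (transform(r) for r in extract(source)))
print(conn.execute("SELECT name, amount FROM sales").fetchall())
# [('WIDGET', 20.0), ('GADGET', 5.0)]
```

In a production pipeline each stage would add logging and error handling, and the script would be registered with the enterprise scheduler rather than run ad hoc.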
Education:

  • Bachelor's degree in Computer Science or Finance.

Company Information