Required Skills

AWS Redshift, Python/PySpark, Airflow, SQL, Stored Procedures

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 2nd Nov 2024

Job Detail

  • A minimum of 10 years of ETL/data warehousing experience.
  • Understand the overall architecture of the system and ensure deliverables are in line with the proposed architecture; strong, in-depth knowledge of RDBMS and SQL.
  • Understand source and target data models, and develop AWS Glue ETL pipelines using PySpark.
  • Develop stored procedures with a focus on building reusable components.
  • Analyze data processing requirements, source data, and target data models to develop complex SQL queries for data consumption by downstream applications.
  • Perform unit testing and prepare unit test logs.
  • Provide QA/UAT/warranty support, coordinating and resolving observations and defects raised in QA/UAT/Prod.
  • Understand and implement DMS jobs.
  • Develop components and reusable assets using other AWS services as applicable, such as DMS, Lambda, S3, CloudFormation, and Airflow.
  • Develop components and reusable assets using IICS in a data integration/data warehousing landscape (secondary skill, nice to have).
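The stored-procedure and reusable-component responsibilities above can be illustrated with a minimal Python sketch: a helper that generates a Redshift-style "upsert" (delete-then-insert from a staging table), a common pattern in warehouse ETL. All table and column names here are hypothetical placeholders, not taken from the posting.

```python
# Illustrative sketch only: a reusable SQL generator for a Redshift-style
# upsert (delete matching rows from the target, then insert the staging
# rows). Names such as "sales" and "order_id" are hypothetical examples.

def build_upsert_sql(target: str, staging: str, key_columns: list) -> str:
    """Return DELETE + INSERT statements that merge staging into target."""
    join_cond = " AND ".join(
        "{t}.{c} = {s}.{c}".format(t=target, s=staging, c=col)
        for col in key_columns
    )
    delete_stmt = "DELETE FROM {t} USING {s} WHERE {cond};".format(
        t=target, s=staging, cond=join_cond
    )
    insert_stmt = "INSERT INTO {t} SELECT * FROM {s};".format(
        t=target, s=staging
    )
    return delete_stmt + "\n" + insert_stmt

sql = build_upsert_sql("sales", "sales_staging", ["order_id", "line_no"])
```

In practice this kind of generator would be wrapped in a stored procedure or a Glue/Airflow task so the same merge logic is reused across tables instead of being copy-pasted per pipeline.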

Company Information