Required Skills

ETL

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Mar 2024

JOB DETAIL

Works alongside Information & Data Architects, Engineers, Data Scientists, and other stakeholders to design and maintain moderate to advanced data models. Responsible for designing, building, and maintaining large-scale databases that support web applications and other digital services. Responsible for developing ETL data pipelines, helping implement data lineage requirements, and supporting reports that provide accurate and timely data for internal and external clients. This role requires familiarity with data architecture, data sources, data flow, and the extraction and manipulation of data sets of various sizes.

Position Responsibilities:
Design and Methodology

  • Design, build, test, and document solutions to implement and validate data lineage and stabilize data pipelines to ensure reliable operations.

  • Design, implement, and operate medium to large-scale, high-volume, high-performance data structures for reporting, analytics, and data science.

  • Define and implement robust data lineage rules and standards; ensure thorough testing and validation is performed to create a quality product.

  • Must have working experience with data modeling, data mapping, transformation, and lineage.

  • Implement data ingestion routines, both real time and batch, using best practices in data modeling and ETL/ELT processes and leveraging leading ETL data tools (a minimal illustrative sketch follows this list).

  • Ensure data cataloging and lineage requirements are reviewed, implemented, and tested during ETL process implementations.

  • Design, develop, manage, and build real-time data pipelines from a variety of sources.

  • Collaborate with engineers to adopt best practices in data integrity, design, analysis, testing, validation, and documentation.

  • Coordinate with other teams to design optimal patterns for data ingest and egress, and coordinate data quality initiatives and troubleshooting.

  • Participate in sprint planning meetings as needed.

  • Continually improve data lineage through ETL automation and support production activities.

  • Review and be familiar with automated processes for performance and fault tolerance.

  • Design and implement security measures to protect data from unauthorized access or misuse.

  • Keep management informed of the status of activities through accurate, timely, and appropriate reporting.

  • Contribute to data governance, system documentation, and sharing of data asset knowledge.
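
For illustration only, the sketch below shows the general shape of a minimal batch ETL routine in Python (extract, transform, load). The file, table, and column names are hypothetical, and SQLite stands in for a warehouse; the actual pipelines for this role would use the Informatica IICS, Snowflake, and AWS tooling listed under the qualifications.

# Minimal batch ETL sketch (illustrative only; names are hypothetical).
import csv
import sqlite3

def extract(path):
    # Read raw order records from a CSV export.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Normalize types and drop rows that fail basic validation.
    cleaned = []
    for row in rows:
        try:
            cleaned.append((row["order_id"], row["customer_id"], float(row["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine and log bad records
    return cleaned

def load(records, db_path="warehouse.db"):
    # Load cleaned records into a staging table.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders (order_id TEXT, customer_id TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
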
Position Qualifications:

  • Bachelor's Degree in computer science, engineering, or a technology-related field, OR equivalent through a combination of education and/or technology experience, OR 12 years of technology experience

  • 6 years of experience using Informatica

  • 3 years of IICS and Snowflake ETL development experience with Python programming and SQL

  • 5 years of experience in data engineering, developing end-to-end scalable data applications and data pipelines with data cataloging and lineage

  • 5 years of working knowledge of different databases (e.g., SQL and NoSQL) and AWS cloud technologies

  • 4 years of experience developing strong collaborative relationships with key data and business partners

  • 3 years of experience working with software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, and testing

  • 2 years of experience working in enterprise data warehouse solutions and platforms
Licenses/Certificates:

Informatica IICS certification

Amazon Web Services (AWS) Certified Solution Architect

Company Information