Required Skills

Ab Initio C++ core.

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Dec 2023

JOB DETAIL

Job Summary:
We’re in the middle of transforming a high-volume, highly visible, mission-critical “Big Clinical Measure” batch application from an on-prem Hadoop platform (native Hadoop tools with a C++ core) to a containerized solution on the GCP platform using Ab Initio (with the C++ core). We are seeking an experienced Lead Ab Initio developer who has done containerization, ideally on GCP.

  

Experience:

A minimum of 9 years of experience as an Ab Initio developer. The candidate should have lead and hands-on experience in Ab Initio.

 

Mandatory Skill Set

  • Ab Initio: Proficiency in Ab Initio software, including GDE (Graphical Development Environment) and Co-Operating System. 
  • ETL Expertise: Strong knowledge of ETL concepts and best practices. 
  • Data Warehousing: Familiarity with data warehousing concepts and methodologies. 
  • SQL: Proficiency in SQL for data manipulation and querying. 
  • Scripting: Experience with Unix/Linux scripting is a plus. 
  • Problem-Solving: Strong problem-solving and analytical skills. 
  • Communication: Excellent communication and interpersonal skills for collaboration with team members and stakeholders. 
  • Familiarity with Big Data technologies (e.g., Hadoop) and GCP. 

Certification in Ab Initio is a plus. 


Responsibilities: 

  • ETL Development: Design, develop, and maintain ETL processes using Ab Initio for data integration and data warehousing. 
  • Data Transformation: Create transformations to cleanse, enrich, and transform data as per business requirements. 
  • Data Quality: Ensure data quality and accuracy by implementing data validation and cleansing processes. 
  • Performance Optimization: Tune ETL processes to improve performance and efficiency. 
  • Integration: Collaborate with cross-functional teams to integrate ETL solutions with other systems and applications. 
  • Documentation: Maintain clear and up-to-date documentation for ETL processes and data mappings. 
  • Testing: Perform unit testing and participate in integration testing to ensure the reliability of ETL processes. 
  • Troubleshooting: Investigate and resolve ETL-related issues, including data discrepancies and performance bottlenecks. 
  • Monitoring: Implement monitoring solutions to proactively identify and address any issues in ETL jobs. 

  

Company Information