Required Skills

Hadoop, Hive MR, Hive Tez, Blaze, data analysis, data mapping, data loading

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Dec 2020

JOB DETAIL

Position: Informatica Developer

Location: 100% Remote


Job Description:

  • 5-7 years of experience in Information Technology, with experience in the Healthcare or Health Insurance industry.
  • Extensive experience in Informatica BDM is required.
  • Experience at Enterprise-level ETL development and ETL architecture using Informatica.
  • Strong Informatica technical knowledge of the Informatica Designer components: Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Strong analytical and logical programming skills at the conceptual level.
  • Good knowledge of Erwin model generation tools (logical and physical); data analysis, data mapping, data loading, and data validation.
  • Understanding of reusability, parameterization, workflow design, etc.
  • Understanding of the entire software development life cycle and various software engineering methodologies.
  • Should be proficient in Informatica Big Data Management: develop and run dynamic mappings, and automate metadata changes through dynamic mappings.
  • Should be able to identify the optimization methods used by the Informatica Smart Executor.
  • Should know how to use Informatica Big Data Management Edition to monitor and troubleshoot Hadoop.
  • Familiarity with Hadoop architecture components, including the Hadoop Distributed File System (HDFS), MapReduce, and YARN (Yet Another Resource Negotiator), is preferred.
  • Working knowledge of Teradata and BTEQ scripts is preferred; should have worked on projects with Teradata as one of the data sources for Informatica.
  • Migrate Informatica PowerCenter mappings to Informatica Big Data Management and ingest data into Hadoop.
  • Should have used Sqoop to ingest data into Hadoop.
  • Leverage the capabilities of the Informatica engines on Hadoop including Hive MR, Hive Tez, Blaze, and Spark engines.

Company Information