Required Skills

SQL Server, MySQL, DB2

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 23rd Sep 2022

JOB DETAIL

  • 8-10+ years of experience in Information Technology, with proficiency in ETL design/development and Data Warehouse implementation/development.
  • Experienced in the design, development, and implementation of large-scale projects in the Financial, Shipping, and Retail industries using Data Warehousing ETL tools (Pentaho) and Business Intelligence tools.
  • Knowledge of the Software Development Lifecycle (SDLC), Agile, and the Application Maintenance Change Process (AMCP).
  • Excellent data analysis skills.
  • Experience architecting and building Data Warehouse and Business Intelligence systems, including ETL using the Pentaho BI Suite (Pentaho Data Integration Designer / Kettle).
  • Hands-on experience with Data Warehouse Star Schema modeling, Snowflake modeling, fact and dimension tables, and physical and logical data modeling (a short SQL sketch follows this list).
  • Installed and configured Pentaho BI Server on different operating systems such as Red Hat Linux and Windows Server.
  • Hands-on experience with the end-to-end ETL (Extract, Transform & Load) process.
  • Experience creating ETL transformations and jobs with the Pentaho Kettle Spoon designer and Pentaho Data Integration Designer, and scheduling them on the Pentaho BI Server.
  • Used a wide range of steps in Pentaho transformations, including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence, and Add Constants, along with various input and output step types for data sources including tables, Access, text files, Excel, and CSV files.
  • Integrated Kettle (ETL) with Hadoop and various NoSQL data stores via the Pentaho Big Data Plugin, a Kettle plugin that provides connectors to HDFS, MapReduce, HBase, Cassandra, MongoDB, and CouchDB across Pentaho Data Integration.
  • Loaded unstructured data into the Hadoop Distributed File System (HDFS).
  • Experience in performing Data Masking/Protection using Pentaho Data Integration (Kettle).
  • Experience writing shell scripts for various ETL needs.
  • Deep knowledge of RDBMSs (SQL Server, MySQL, DB2, etc.) and NoSQL databases such as MongoDB, DynamoDB, and Cassandra.
  • Proficient in writing T-SQL statements, complex stored procedures, dynamic SQL queries, batches, scripts, functions, triggers, views, and cursors, and in query optimization (see the stored-procedure sketch after this list).
  • Quick to grasp relational source database systems and data models in order to build accurate transformation logic for Data Migration and Data Integration.
  • Supply Chain knowledge is good to have.
  • Motivated team player with excellent communication, interpersonal, analytical and problem-solving skills.
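
As a minimal illustration of the Star Schema modeling called for above (a sketch only, with hypothetical table and column names, in portable SQL that runs on SQL Server and MySQL alike):

    -- Dimension: one row per calendar day
    CREATE TABLE dim_date (
        date_key    INT         NOT NULL PRIMARY KEY,  -- e.g. 20220923
        full_date   DATE        NOT NULL,
        month_name  VARCHAR(10) NOT NULL,
        year_num    INT         NOT NULL
    );

    -- Dimension: one row per customer
    CREATE TABLE dim_customer (
        customer_key  INT          NOT NULL PRIMARY KEY,
        customer_name VARCHAR(100) NOT NULL,
        region        VARCHAR(50)  NOT NULL
    );

    -- Fact table: one narrow row per sale, tied to each dimension by a surrogate key
    CREATE TABLE fact_sales (
        sales_key    INT           NOT NULL PRIMARY KEY,
        date_key     INT           NOT NULL,
        customer_key INT           NOT NULL,
        quantity     INT           NOT NULL,
        sales_amount DECIMAL(12,2) NOT NULL,
        FOREIGN KEY (date_key)     REFERENCES dim_date (date_key),
        FOREIGN KEY (customer_key) REFERENCES dim_customer (customer_key)
    );

A Snowflake model differs only in normalizing the dimensions further, for example splitting region out of dim_customer into its own table.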

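A comparable sketch of the stored-procedure work, in SQL Server's T-SQL (hypothetical object names, reusing the tables from the star-schema sketch above):

    -- Total sales for one customer over a date-key range
    CREATE PROCEDURE dbo.usp_customer_sales
        @customer_key  INT,
        @from_date_key INT,
        @to_date_key   INT
    AS
    BEGIN
        SET NOCOUNT ON;  -- suppress row-count messages for cleaner client calls

        SELECT c.customer_name,
               SUM(f.sales_amount) AS total_sales
        FROM   fact_sales   AS f
        JOIN   dim_customer AS c ON c.customer_key = f.customer_key
        WHERE  f.customer_key = @customer_key
          AND  f.date_key BETWEEN @from_date_key AND @to_date_key
        GROUP BY c.customer_name;
    END;

Called as, for example: EXEC dbo.usp_customer_sales @customer_key = 42, @from_date_key = 20220101, @to_date_key = 20221231;
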
Company Information