Required Skills

Strong UNIX shell scripting, Sqoop, Eclipse, HCatalog.

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 17th Jul 2021

JOB DETAIL

*Bachelor's degree and 5 years of Information Technology experience, OR technical certification and/or college courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience. Master's degree (in a technical subject) preferred but not required.

• *Ability to manage workload, multiple priorities, and conflicts with customers, employees, and managers, as applicable. Additionally, the ability to direct and manage a team of integration designers, developers, and testers in building large-scale, complex integrations throughout a modern data ecosystem.

• *Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster, with extensive experience in integration (including ETL, message-based, streaming, and API styles of integration) and data warehousing, preferably with tools such as Talend Data Integration and Talend Big Data Platform Edition 6.2.1 or comparable toolsets. Talend is HCSC's preferred tool for data integration. If you have extensive experience with another tool, you are expected to transfer those skills to the Talend toolset within 30-60 days. HCSC is committed to placing experienced resources and therefore uses a CodeVue test for potential candidates.

• *Must have experience with strong UNIX shell scripting, Sqoop, Eclipse, and HCatalog.

• *Must have experience with NoSQL databases such as HBase, MongoDB, Cosmos DB, graph databases, or Cassandra.

• *Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing semi-structured, unstructured, and structured data flows.

• *Must have working experience developing MapReduce programs that run on the Hadoop cluster, using Java/Python.

• *Experience with Spark and Scala, or another JVM-based language, with data integration experience (see the Spark sketch after this list).

• *Must have working knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2, Azure) and considerations for scalable, distributed systems.

• *Must demonstrate integration best practices with a focus on Talend.

• Must have extensive knowledge of version control tools such as Git and SVN.

• Hands-on experience with PCF using the Talend suite.

• Experience implementing complex business rules in Talend by creating reusable transformations and robust mappings/mapplets. Experience loading data and troubleshooting, debugging, and tuning Talend mappings.

• *Experience designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Sorter, Normalizer, Sequence Generator, Router, Filter, Expression, Aggregator, Joiner, Rank, Update Strategy, Stored Procedure, XML Source Qualifier, and Input and Output transformations.

• *Expertise in data modeling concepts, including dimensional modeling and star and snowflake schemas. Experience with CDC and daily load strategies for data warehouses and data marts, slowly changing dimensions (Type 1, Type 2, and Type 3), surrogate keys, and general data warehouse concepts (see the Type 2 sketch after this list).

• Hands-on experience in performance tuning of Talend and Informatica ETL, integrations, queries, and jobs.

• Demonstrates broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in Hadoop production. Additionally, experience designing for sustainability: minimizing impact on operations crews in the event of system outages, unexpected data, etc., and engineering the code base for straightforward extensions and expansions.

• *Must have working experience with data warehousing and Business Intelligence systems. Additionally, experience building inline data quality analysis into integration flows, and experience working with metadata across the integration landscape in support of data governance and operational needs.

• *Participate in design reviews, code reviews, unit testing, and integration testing.

• *Assume ownership and accountability for assigned deliverables through all phases of the development lifecycle.

• *SDLC methodology (Agile/Scrum/iterative development).

• *System performance management.

• *Systems change/configuration management.

• *Business requirements management.

• *Problem solving/analytical thinking.

• *Creative thinking.

• *Ability to execute.
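
For illustration only, and not an HCSC artifact, the following minimal Scala sketch shows the kind of Spark-on-Hive integration job referenced in the Spark/Scala bullet above. All table and column names (staging.claims_raw, member_id, plan_code, warehouse.claims_by_plan) are hypothetical placeholders, and the inline null check simply stands in for the data quality analysis mentioned earlier.

import org.apache.spark.sql.SparkSession

// Minimal sketch of a Spark-on-Hive integration job. All table and column
// names used here are hypothetical placeholders, not an actual schema.
object ClaimsRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("claims-rollup")
      .enableHiveSupport() // read and write Hive/HCatalog-managed tables
      .getOrCreate()
    import spark.implicits._

    // Read a feed that has already been landed in a Hive staging table.
    val claims = spark.table("staging.claims_raw")

    // Simple inline data quality gate: drop rows missing the natural key.
    val clean = claims.filter($"member_id".isNotNull)

    // Aggregate and write back as a managed Hive table for downstream marts.
    clean.groupBy($"plan_code")
      .count()
      .write
      .mode("overwrite")
      .saveAsTable("warehouse.claims_by_plan")

    spark.stop()
  }
}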
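
Similarly illustrative and not part of the requirements, here is a small, self-contained Scala sketch of the Type 2 slowly changing dimension handling named in the data modeling bullet: when a tracked attribute changes, the current row is end-dated and a new current version with a fresh surrogate key is appended. The DimMember shape and the choice of address as the tracked attribute are assumptions made for the example.

import java.time.LocalDate

// Illustrative Type 2 slowly changing dimension update. The entity, columns,
// and sample values are hypothetical.
final case class DimMember(
  surrogateKey: Long,
  memberId: String,            // natural key
  address: String,             // tracked attribute
  effectiveDate: LocalDate,
  endDate: Option[LocalDate],  // None while the row is current
  isCurrent: Boolean
)

object Scd2Sketch {
  def applyChange(dim: List[DimMember], memberId: String, newAddress: String,
                  asOf: LocalDate, nextKey: Long): List[DimMember] =
    dim.find(r => r.memberId == memberId && r.isCurrent) match {
      case Some(cur) if cur.address != newAddress =>
        // Type 2: expire the superseded version and append a new current one.
        val expired = cur.copy(endDate = Some(asOf), isCurrent = false)
        val fresh   = DimMember(nextKey, memberId, newAddress, asOf, None, isCurrent = true)
        fresh :: expired :: dim.filterNot(_ eq cur)
      case Some(_) =>
        dim // tracked attribute unchanged: nothing to do
      case None =>
        // Previously unseen natural key: insert the first version.
        DimMember(nextKey, memberId, newAddress, asOf, None, isCurrent = true) :: dim
    }

  def main(args: Array[String]): Unit = {
    val start = List(DimMember(1L, "M100", "12 Oak St", LocalDate.of(2020, 1, 1), None, isCurrent = true))
    applyChange(start, "M100", "9 Elm Ave", LocalDate.of(2021, 7, 17), nextKey = 2L).foreach(println)
  }
}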

Company Information