Roles and Responsibilities
We do not consider contractual employees.
A minimum of 5 years of experience is a must.
Immediate joiners are preferred. A score of 60% or above is required in 10th, 12th, and graduation.
JOB SUMMARY: Data Engineering Lead with 5+ years of strong experience, ready to implement ETL/DWH solutions on AWS using ETL tools, AWS services, and open-source services. Will be part of a team requiring direct, close customer interaction through the full model life cycle.
Keywords: ELT, ETL (Snowflake, Informatica), Dataiku, AWS, Data Engineering, Feature Engineering, Data Pipeline, Data Warehouse, Python
JOB OBJECTIVE: To be a driven analyst who can work on complex analytical problems and help the customer make better business decisions, especially with global pharma/life sciences (domain) datasets.
Your responsibilities include, but are not limited to:
- Technology: Independently design and lead the implementation of complex Data Warehousing projects.
- Requirements Gathering: Understand requirements directly from the customer or from project teams across pharma commercial data sets.
- Technical Design and Development: Independently create ETL designs, then implement and deliver complex ETL code for simple to complex Data Modeling / Data Integration / Data Warehousing project assignments using any of the listed ETL/ELT tools (StreamSets, Matillion, PySpark, Snowflake, Informatica, Glue).
- Expert knowledge of SQL, with the ability to performance-tune complex SQL queries, along with a deep understanding of different data partitioning strategies.
- Develop ETL/ELT code using project tools.
- Create scripts using Python libraries to support transformation capabilities not directly available in the tools.
- Troubleshoot L2/L3 issues across customer implementations and deliver fixes in a time-bound manner.
- Task Management: Drive the day-to-day tasks in alignment with the project plan and guide the team to accomplish milestones on schedule. Be comfortable discussing and prioritizing work items in an onshore-offshore model.
- Logical Thinking: Think analytically, using a systematic and logical approach to analyse data, problems, and situations.
- Client Relationship: Manage client communication and client expectations independently or with the support of the reporting manager. Deliver results back to the client as planned. Excellent communication skills are required.
- Communication: Convey ideas and information clearly and accurately to others, whether in writing or verbally.
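As an illustration of the "scripts using Python libraries" responsibility above, here is a minimal, hypothetical sketch: pivoting long-format sales data into wide columns with pandas, a reshape that some ETL tools handle awkwardly. All table and column names are invented for the example.

```python
import pandas as pd

def pivot_monthly_sales(df: pd.DataFrame) -> pd.DataFrame:
    """Turn one row per (product, month) into one row per product."""
    return (df.pivot_table(index="product", columns="month",
                           values="sales", aggfunc="sum")
              .reset_index()
              .rename_axis(columns=None))

# Hypothetical long-format extract
raw = pd.DataFrame({
    "product": ["A", "A", "B"],
    "month": ["Jan", "Feb", "Jan"],
    "sales": [100, 150, 200],
})
wide = pivot_monthly_sales(raw)
```

A script like this would typically run as a transformation step invoked by the ETL tool between extract and load.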
Must-have Skills:
- 5-10 years of experience with cloud ELT/ETL tools; Informatica, Snowflake, and Dataiku are a must.
- Strong knowledge of and experience with AWS technologies; has worked with Redshift/Synapse/Snowflake in the past.
- Has used a Python framework such as Airflow/Luigi/Dagster/Apache Beam, or PySpark.
- Experienced in running an end-to-end ETL project, interacting with users globally, and has done ETL development using a Python framework.
- Hands-on development experience in Python; advanced knowledge of Python, pandas, and NumPy.
- Good knowledge of DW architectural principles and of ETL mapping, transformation, workflow design, and batch script development.
- DW/BI experience implementing ETL solutions using Informatica and Oracle; proficient in crafting Informatica mappings, sessions, workflows, and mapplets, with hands-on experience implementing CDC and SCD Type 2.
- Has worked with integration technologies such as Hive, Airflow, and Python.
- Experience developing complex ETL solutions.
- Must have experience writing SQL in Oracle; should be able to write and understand complex queries.
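To illustrate the SCD Type 2 skill named above, here is a minimal sketch in pandas: expire the current version of a changed dimension row and append a new version. Table, key, and column names (`cust_id`, `city`, `start_date`, `end_date`) are hypothetical; production implementations would normally do this in the warehouse or an ETL tool rather than in pandas.

```python
from datetime import date
import pandas as pd

HIGH_DATE = date(9999, 12, 31)  # sentinel end date marking the current row

def scd2_upsert(dim, incoming, key, attr, load_date):
    """SCD Type 2 for one tracked attribute: expire changed current
    rows (set end_date) and append new current versions."""
    dim = dim.copy()
    current = dim["end_date"] == HIGH_DATE
    cur_map = dim.loc[current].set_index(key)[attr]

    new_rows = []
    for _, row in incoming.iterrows():
        k, v = row[key], row[attr]
        if k in cur_map.index and cur_map[k] == v:
            continue  # attribute unchanged: keep the existing current row
        # Expire the old current version, if one exists
        dim.loc[current & (dim[key] == k), "end_date"] = load_date
        new_rows.append({key: k, attr: v,
                         "start_date": load_date, "end_date": HIGH_DATE})

    return pd.concat([dim, pd.DataFrame(new_rows)], ignore_index=True)

# Hypothetical customer dimension: customer 1 moves, customer 2 is new
dim = pd.DataFrame([{"cust_id": 1, "city": "Pune",
                     "start_date": date(2023, 1, 1), "end_date": HIGH_DATE}])
inc = pd.DataFrame([{"cust_id": 1, "city": "Mumbai"},
                    {"cust_id": 2, "city": "Delhi"}])
out = scd2_upsert(dim, inc, "cust_id", "city", date(2024, 1, 1))
```

After the upsert, the old Pune row is closed out with the load date while the Mumbai and Delhi rows become the open-ended current versions, preserving full history.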
Qualifications we seek in you: