UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 4th Nov 2022

Responsibilities:
- Work with stakeholders to gather requirements
- Create new workflows and pipelines per business needs
- Troubleshoot existing workflows and pipelines to meet stakeholder requirements
- Participate in planning meetings
- Be available during weekend/off-hours downtime windows to move changes to PROD
- Collaborate with other teams to align changes and avoid schedule overlaps
Required Skills:
- 5+ years of hands-on experience using DataStage version 9 or higher (DataStage 11.7.X experience is preferred)
- 2+ years of hands-on experience using SAP Data Services to build data integrations
- Solid practical knowledge of collecting data from API endpoints such as JIRA and ServiceNow
- Strong ETL experience handling large data volumes in complex, heterogeneous data warehouses and processing high-volume jobs
- Prior experience building data ingestion pipelines and replicating data into cloud environments hosted on the Azure platform and Azure Databricks
- Hands-on experience creating ETL jobs and performing tuning/optimization
- Strong knowledge of databases such as Oracle, IBM DB2, SAP HANA, and OLAP data warehouses
Good to have:
- Experience working with Snowflake
- Experience with SQL and UNIX shell scripting
- Experience working with Tivoli Workload Scheduler (TWS)/Maestro and BMC Control-M