- Provide and develop production-grade Talend Big Data jobs, with consideration for meeting the IT organization's architecture standards
- Work closely with data modelers/system analysts on the required data interface/requirement specifications
- Conduct technical sessions/clarifications with data modelers and sub-system teams prior to solution design, coding and unit testing
- Document and review the design of Talend Big Data jobs and interface specifications
- Implement error and exception handling
- Import jobs to Talend Data Catalog
- Integrate Talend jobs with Autosys and restart failed jobs from Autosys
- Perform technical impact assessment, source code release and deployment checklist
- Work effectively with peers and vendors to develop, setup and support IFRS17 application and data integration
- Support SIT and UAT activities e.g. perform defect analysis, troubleshooting and fixing
- Coordinate and support Performance and Security Testing activities e.g. environment setup and test scope
- Coordinate with infrastructure team on deployment and related activities
Working Experience
i. Data Warehousing experience: 4 years minimum
ii. 3+ years of working experience as a Hadoop (Hortonworks) developer in Spark and Hive, especially Spark version 2
iii. Talend Big Data version 4+ developer experience: 2-3 years of working experience deploying code to production
iv. 2+ years of working experience in Talend Big Data version 7 on Hadoop
v. Experience designing Talend job orchestration through an enterprise workload automation tool such as Control-M or Autosys (Autosys preferred)
vi. Development experience using Java, PL/SQL, SQL, Python and Scala, with good knowledge of data models and data flows
Perks and Benefits
- Exclusively onsite opportunity
- Visa processing and approval will be provided for candidates
- Flight tickets will also be provided
- Two weeks of free accommodation