- Corp-Corp
- W2-Permanent
- W2-Contract
- Contract to Hire
- UG: Not Required
- PG: Not Required
- No. of positions: 1
- Posted: 3rd Dec 2024
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience), with a minimum of 10+ years of relevant experience.
- 5+ years of experience with Dataflow for designing scalable data processing pipelines.
- Proficiency in Java for application development and data handling.
- Experience with Informatica for data integration, ETL processes, and data management.
- Strong problem-solving skills with the ability to optimize complex data workflows.
- Knowledge of cloud platforms (e.g., Google Cloud, AWS, or Azure) is a plus.
- Excellent communication skills to collaborate with team members and stakeholders effectively.
Responsibilities:
- Develop and maintain data pipelines using Dataflow to support real-time and batch processing.
- Design, implement, and optimize code in Java to enhance data processing capabilities and system performance.
- Integrate, transform, and manage data using Informatica tools to ensure data quality and consistency.
- Collaborate with cross-functional teams to gather requirements, develop solutions, and support data-driven decision-making.
- Troubleshoot and resolve performance issues across the data pipeline and integration processes.
- Ensure best practices in data security and governance are followed throughout the data architecture.
- Document technical solutions, processes, and procedures for effective knowledge transfer and future maintenance.