- US Citizen
- Green Card
- EAD (OPT/CPT/GC/H4)
- H1B Work Permit
- Corp-Corp
- W2-Permanent
- W2-Contract
- Contract to Hire
- UG: Not Required
- PG: Not Required
- No. of positions: 1
- Posted: 30th Jun 2025
- Design and build metadata-driven data pipelines using tools like Azure Data Factory (ADF) and Informatica (see the illustrative sketch after this list).
- Develop and optimize Operational Data Stores (ODS) leveraging Azure Data Lake.
- Implement and manage data solutions on Azure, ensuring efficient cloud resource utilization and cost optimization.
- Use Azure Functions for data processing automation and orchestration.
- Work with Guidewire Data and ensure seamless integration and processing.
- Write robust and scalable code using Python, T-SQL, and Spark to support custom data transformation processes.
- Integrate and process data from diverse sources into Azure Data Lake and SQL Server.
- Knowledge of Hadoop is a plus for handling large-scale data processing and storage needs.
- Utilize prior Property and Casualty (P&C) insurance domain experience to align technical solutions with business requirements.
- Collaborate with stakeholders to gather requirements, define data strategies, and translate business goals into technical implementations.
- Provide clear and effective communication across technical and non-technical teams.
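
To give a rough sense of the "metadata-driven" pattern mentioned above, here is a minimal, hedged sketch in plain Python. All names (`PIPELINE_METADATA`, `apply_mapping`, the sample columns) are hypothetical and not taken from this posting; in the actual role the same pattern would more likely be expressed through ADF mapping data flows, Informatica mappings, or PySpark jobs.

```python
# Minimal sketch of a metadata-driven transformation step.
# The mapping table, not hard-coded logic, decides which source fields
# are projected, how they are renamed, and how each value is converted.

from datetime import date
from typing import Any, Callable

# Hypothetical metadata: target column -> source column + conversion.
PIPELINE_METADATA: dict[str, dict[str, Any]] = {
    "policy_number": {"source": "PolicyNo", "cast": str},
    "premium_amount": {"source": "Premium", "cast": float},
    "effective_date": {"source": "EffDate", "cast": date.fromisoformat},
}


def apply_mapping(record: dict[str, Any],
                  metadata: dict[str, dict[str, Any]]) -> dict[str, Any]:
    """Project and convert one source record according to the metadata."""
    out: dict[str, Any] = {}
    for target, rule in metadata.items():
        raw = record.get(rule["source"])
        cast: Callable[[Any], Any] = rule["cast"]
        out[target] = cast(raw) if raw is not None else None
    return out


if __name__ == "__main__":
    source_row = {"PolicyNo": "PC-1001", "Premium": "1250.50",
                  "EffDate": "2025-07-01"}
    print(apply_mapping(source_row, PIPELINE_METADATA))
    # {'policy_number': 'PC-1001', 'premium_amount': 1250.5,
    #  'effective_date': datetime.date(2025, 7, 1)}
```

Because the column mappings live in metadata rather than in code, new sources (for example, additional Guidewire extracts) can typically be onboarded by adding configuration entries instead of rewriting the pipeline.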