US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-Corp
W2-Permanent
W2-Contract
Contract to Hire
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 6th Jan 2026
Connect and Collect: data exploration and preparation.
Transform and Enrich: data representation and transformation.
Publish and Serve: publish data.
Monitor: data pipelines and solutions.
Developing, building, maintaining, and managing data pipelines; working with large datasets, databases, and the software used to analyze them.
Fluent with Azure data tools and components.
Primary focus is to ensure that data flows from source to destination smoothly, efficiently, and securely.
Demonstrated experience in programming/scripting (e.g. Python, SQL).
Demonstrated experience in analytics/data product solution architecture on Azure (Azure Data Factory, Databricks).
Hands-on experience in data extract, load, and transformation techniques and tools, including orchestration of the needed Azure resources.
Ensuring the accuracy of data and promoting data quality.
Building, testing, and maintaining data pipeline architecture.
Demonstrated experience with data and software engineering tools and Agile methodologies (e.g. Agile, ADO, Ansible, G/T).
Demonstrated experience in analytical data frameworks and Azure Analysis Services.
Skills:
Modern Data Warehousing
Data flow transformations
Implement and design data engineering and integration patterns: messaging, shared database, file-based, and service-based.
Data security design: authorization, sharding, allocations, authentication
Database systems and large-scale processing systems
Consume and develop data-driven APIs (a minimal Python sketch follows this list).
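As an illustration of the last two items above, here is a minimal, non-authoritative Python sketch that consumes a JSON API and applies a simple data flow transformation. The endpoint URL, field names, and output file are hypothetical placeholders, not part of this posting.

```python
# Minimal sketch: consume a data-driven API and apply a simple data flow transformation.
# The endpoint URL and field names are hypothetical placeholders.
import requests
import pandas as pd

API_URL = "https://example.com/api/v1/orders"  # placeholder endpoint

def fetch_orders(url: str) -> pd.DataFrame:
    """Pull JSON records from the API into a flat DataFrame."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.json_normalize(response.json())

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Example transformation: type coercion plus a per-customer aggregate."""
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return (
        df.dropna(subset=["amount"])
          .groupby("customer_id", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "total_amount"})
    )

if __name__ == "__main__":
    orders = fetch_orders(API_URL)
    summary = transform(orders)
    summary.to_parquet("customer_totals.parquet", index=False)  # hand off downstream
```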
Technologies
Azure Data Factory
Azure Synapse
Azure Data Lake
Databricks
Databases (SQL, NoSQL, Cosmos DB)
Data Flow Transformations via SQL, Python (a PySpark sketch follows this list)
Business Intelligence (Power BI, Spotfire, Power Apps)
SAP BW
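To make the data flow transformation item concrete, the following is a minimal PySpark sketch of the kind of transformation that might run in a Databricks or Synapse Spark notebook. The storage path, column names, and target table are hypothetical placeholders.

```python
# Minimal sketch of a data flow transformation in PySpark. The lake path,
# columns, and target table are placeholders, not part of the job description.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

raw = (
    spark.read.format("parquet")
    # Placeholder ADLS path; replace <storage-account> with a real account.
    .load("abfss://raw@<storage-account>.dfs.core.windows.net/sales/")
)

rollup = (
    raw.withColumn("sale_date", F.to_date("sale_timestamp"))
       .groupBy("sale_date", "region")
       .agg(F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("order_count"))
)

# Delta format is assumed to be available (it is preconfigured on Databricks).
rollup.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_sales")
```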
Example duties:
Build scalable data pipelines to enable data-driven work.
Design and architect data integration patterns and styles.
Establish fit-for-purpose guardrails for data ingestion and transformation.
Design and develop reusable data engineering patterns, from simple ETL to complex modern data warehousing involving multiple endpoints and data sources. Leverage existing frameworks and accelerators (a minimal reusable-pattern sketch follows this list).
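As a rough illustration of the reusable-pattern duty, the sketch below shows one possible pattern in plain Python: a pipeline is three swappable callables (extract, transform, load), so the same runner covers simple ETL and more involved flows. File names, columns, and the cleaning rule are hypothetical; a real implementation would sit behind Azure Data Factory or Databricks orchestration.

```python
# Minimal sketch of a reusable ETL pattern: each pipeline is three swappable
# callables, so the same runner serves many sources and targets.
import csv
from dataclasses import dataclass
from typing import Callable, Iterable

Record = dict

@dataclass
class Pipeline:
    extract: Callable[[], Iterable[Record]]
    transform: Callable[[Iterable[Record]], Iterable[Record]]
    load: Callable[[Iterable[Record]], None]

    def run(self) -> None:
        # Chain the three stages; each stage stays independently testable.
        self.load(self.transform(self.extract()))

def read_csv(path: str) -> Iterable[Record]:
    with open(path, newline="") as handle:
        yield from csv.DictReader(handle)

def drop_empty_ids(rows: Iterable[Record]) -> Iterable[Record]:
    # Hypothetical cleaning rule: keep only rows with a non-empty "id".
    return (row for row in rows if row.get("id"))

def write_csv(path: str, fieldnames: list[str]) -> Callable[[Iterable[Record]], None]:
    def _load(rows: Iterable[Record]) -> None:
        with open(path, "w", newline="") as handle:
            writer = csv.DictWriter(handle, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows({k: row.get(k, "") for k in fieldnames} for row in rows)
    return _load

if __name__ == "__main__":
    Pipeline(
        extract=lambda: read_csv("source.csv"),         # hypothetical source
        transform=drop_empty_ids,
        load=write_csv("clean.csv", ["id", "amount"]),  # hypothetical target
    ).run()
```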