Citizen
Full Time
Direct Hire
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 14th Jun 2022
Scope:
Core responsibilities include data engineering in the Azure Data Platform domain. A solid understanding of data ingestion and data transformation using big data systems is essential.
The team currently comprises 30+ global associates across the US and India (COE). We are part of the corporate engineering team, which leverages cutting-edge technologies such as AI/ML, RPA, Low Code, and Big Data to deliver innovative solutions for BYonders. The incumbent should be a quick, adaptable learner who is willing to experiment and learn quickly from failure.
Our Current Technical Environment:
Software: JavaScript, Java, C#, Git, REST API, OAuth, Node.js, Linux
Application Architecture: Scalable, resilient, event-driven, secure, cloud-native architecture
Cloud Architecture: MS Azure (ARM templates, Application Service, Serverless, SQL Server, Databricks, ESB, Logic Apps, Azure Search, Key Vault, Azure AD)
Frameworks/Others: Open Source, Appsmith, Elasticsearch, Event-Driven Design, Composable Architecture, Enterprise Data Platform, Informatica (ICS, ICRT, MDM)
What You'll Do Here:
Collaborate with experienced IT engineers to ingest and prep data that delivers business intelligence, enabling BYonders to make data-driven decisions (see the sketch after this list).
Deliver proofs of concept for business problems by leveraging the current technology stack and open-source technologies
Work in an agile environment to deliver high-quality software.
Be a key contributor to Enterprise Data Platform Operations and Maintenance
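The ingest-and-prep responsibility above is illustrated by the minimal PySpark sketch below. It assumes a hypothetical raw CSV landing path and a curated Delta output path in Azure Data Lake; the storage account, containers, and column names are placeholders for illustration only, not part of this role's actual codebase.

```python
# Minimal PySpark sketch: ingest raw CSV from a (hypothetical) Azure Data Lake
# landing zone, apply a simple cleanup and aggregation, and write curated output.
# Paths, storage account, and column names are illustrative placeholders only;
# writing Delta assumes a Databricks/Delta Lake-enabled Spark environment.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest-prep").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"              # hypothetical
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_daily/"  # hypothetical

# Ingest: read raw CSV files with header row and schema inference.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Prep: drop malformed rows and aggregate revenue per day for reporting.
daily_revenue = (
    orders
    .dropna(subset=["order_id", "order_date", "amount"])
    .withColumn("order_date", F.to_date("order_date"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.count("order_id").alias("order_count"),
    )
)

# Write the curated output as Delta so it can feed Synapse / Power BI downstream.
daily_revenue.write.format("delta").mode("overwrite").save(curated_path)
```

In practice a pipeline like this would typically be scheduled and orchestrated from Azure Data Factory or a Databricks job rather than run ad hoc.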
What We Are Looking For:
4 to 6 years of experience with Big Data systems in the cloud, especially Azure
Should have hands-on experience in Azure Data Platform Services like Azure Data Factory, Azure Databricks, Azure Data Lake, Key Vault, Azure Synapse Analytics, Azure SQL Server
Should have hands-on experience with Python, PySpark, and Spark SQL
Should be able to develop reports and dashboards using Microsoft Power BI
Should be able to implement robust, reusable, and scalable data pipelines using Azure Data Factory
Should be able to create and consume data services using REST APIs (a minimal consumption sketch follows this list)
Ability to create reusable components for rapid development of the data platform
Should have strong analytical and communication skills
Have experience working within Scrum teams
Ability to quickly resolve issues and refactor code into reusable libraries, APIs, and tools
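To make the REST data-service item above concrete, here is a minimal Python sketch of consuming a data service over HTTPS with bearer-token authentication. The endpoint URL, token handling, and response shape are assumptions for illustration and do not describe an actual service in this stack.

```python
# Minimal sketch of consuming a data service over REST.
# The endpoint, bearer token, and response fields are hypothetical placeholders.
import requests

BASE_URL = "https://data-platform.example.com/api/v1"  # hypothetical endpoint
TOKEN = "<oauth-access-token>"                         # obtained via an OAuth flow (not shown)

def fetch_dataset(dataset_id: str) -> list[dict]:
    """Fetch rows for a dataset from the (hypothetical) data service."""
    response = requests.get(
        f"{BASE_URL}/datasets/{dataset_id}/rows",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()     # surface HTTP errors early
    return response.json()["rows"]  # assumed response shape

if __name__ == "__main__":
    rows = fetch_dataset("daily_revenue")
    print(f"Fetched {len(rows)} rows")
```

The service side would typically follow the same contract: a versioned resource path returning JSON, secured with Azure AD / OAuth as listed in the technical environment above.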
Nice to have:
Understanding of infrastructure (including hosting, container-based deployments, and storage architectures) would be advantageous
Desirable Certifications (at least one, in order of preference below):
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Data Analyst Associate
Analyzing and Visualizing Data with Microsoft Power BI