Required Skills

Azure DevOps

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 23rd Aug 2022

JOB DETAIL

• Bachelor’s Degree or Equivalent Experience

• 10+ years of IT experience

• 6+ years of experience implementing Big Data solutions

• Proficiency in developing batch and streaming applications using PySpark/Scala and Kafka

• Experience using the Azure Databricks platform and Databricks Delta

• Experience using a range of data services on Azure

• Exposure to Azure DevOps

• Proficiency in database concepts and technologies, including MS SQL Server, DB2, Oracle, Cosmos DB, and NoSQL databases

• Proficiency in file formats such as (but not limited to) Avro, Parquet, and JSON

• Familiarity with Data Modeling, Data Architecture, and Data Governance concepts

• Adept at designing and leveraging APIs, including integrations that drive dynamic content

• Demonstrated problem-solving skills and the ability to work collaboratively with other stakeholders or team members to resolve issues

• Ability to lead cross-functional solutions

• Excellent communication skills; able to collaborate effectively with remote teams, both on-shore and off-shore

• Experience with working on Cloud implementations is a plus

• Healthcare background is a plus

Responsibilities

Key responsibilities of this role are:

• Build ETL processes that allow data to flow seamlessly from source to target, using tools like Databricks, Azure Data Factory, and SSIS

• Load and enhance dimensional data models

• Leverage code (Databricks, SQL, Scala, and Spark) to apply business rules, ensuring data is clean and interpreted correctly by all business stakeholders

• Perform peer code reviews and QA

• Fine-tune existing code to make processes more efficient

• Maintain and create documentation describing our data management processes

a) Drive development and delivery of key business initiatives for the Big Data Platform in collaboration with other stakeholders

b) Collaborate with business stakeholders in gathering business requirements

c) Perform POCs on the Big Data Platform to determine the optimum solution

d) Work with vendors in evaluating Big Data technologies and resolving technical issues

e) Effectively collaborate with remote teams (on-shore and off-shore) for solution delivery

Company Information