Required Skills

Big Data

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 14th Nov 2022

JOB DETAIL

Mandatory:

  • 10 years of experience

  • Big Data with Azure

  • Scala

  • Kafka

 

Summary: This role will be part of the Data Exchange group and will report to the Software Engineering Senior Manager. It requires coordination across multiple teams. This role will be a key player in defining and implementing the organization's Big Data strategy and in driving the implementation of IT solutions for the business. It will also provide direction on best practices for determining optimum solutions.

Minimum Education, Licensure and Professional Certification requirement: Bachelor's Degree or equivalent experience

Minimum Experience required (number of years necessary to perform role): 10+ years

Required Skills/Qualifications:

 

  • Bachelor’s Degree or Equivalent Experience
  • Minimum 10 years of IT Experience
  • Minimum 3 years in implementing Big Data Solutions
  • Proficiency in developing batch and streaming applications using PySpark/Scala and Kafka (a brief Scala sketch appears below)
  • At least 3 years' experience working on cloud implementations
  • At least 2 years' experience using the Azure Databricks Platform and Databricks Delta
  • Candidate must be able to lead cross-functional solution efforts
  • Experience in using different data services on Azure
  • Proficient in database concepts and technologies, including MS SQL Server, DB2, Oracle, Cosmos DB, and NoSQL databases
  • Proficiency in file formats such as Avro, Parquet, and JSON
  • Familiarity with Data Modeling, Data Architecture, and Data Governance concepts
  • Adept at designing and leveraging APIs, including integrations that drive dynamic content
  • Exposure to at least one of Azure DevOps, AWS, or Google Cloud
  • Demonstrated problem-solving skills and the ability to work collaboratively with other stakeholders or team members to resolve issues
  • Excellent communication skills; able to collaborate effectively with remote teams, both on-shore and off-shore
  • Healthcare or Financial background is a plus

Job Description Overview

In this role, you will be responsible for full life cycle data solutions, from conception through deployment.

  • Improve coding quality and reliability by implementing good standards and processes. Increase productivity by implementing tools and processes.
  • Serve as the technology go-to person on any technical questions. Resolve complex technical issues.
  • Ensure quality is maintained by following development patterns and standards.
  • Prepare deployment and post-deployment plans to support the conversion and deployment of the solution.
  • Interact with architects, technical project managers, and developers to ensure that solutions meet requirements and customer needs.
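
The Spark-and-Kafka qualification above calls for batch and streaming development in PySpark/Scala. As a minimal sketch of what that looks like in Scala with Spark Structured Streaming (the broker address, topic name, and console sink are illustrative assumptions, not details from this posting):

    import org.apache.spark.sql.SparkSession

    object KafkaStreamSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-stream-sketch")
          .getOrCreate()

        // Subscribe to a Kafka topic as a streaming DataFrame.
        // "broker:9092" and "events" are placeholder values.
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()

        // Kafka delivers key/value as binary; cast the payload to text.
        val events = raw.selectExpr("CAST(value AS STRING) AS payload")

        // Console sink for illustration only; a real job would write to a
        // durable sink such as Delta with checkpointing.
        events.writeStream
          .format("console")
          .outputMode("append")
          .start()
          .awaitTermination()
      }
    }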

 

Responsibilities

  • Build ETL processes that move data seamlessly from source to target using tools like Databricks, Azure Data Factory, and SSIS. Load and enhance dimensional data models. (See the sketch after this list.)
  • Write code in tools like Databricks, SQL, Scala, and Spark to apply business rules, ensuring data is clean and interpreted correctly by all business stakeholders.
  • Perform peer code reviews and QA. Fine-tune existing code to make processes more efficient.
  • Maintain and create documentation to describe our data management processes.
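
As a rough illustration of the ETL responsibility above, here is a minimal batch sketch in Scala/Spark that writes to Delta (the paths, column names, and cleansing rule are hypothetical, and Delta Lake is assumed to be available, e.g. on Databricks):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EtlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("etl-sketch")
          .getOrCreate()

        // Extract: the source path and format are placeholders.
        val source = spark.read.parquet("/mnt/source/orders")

        // Transform: a hypothetical business rule that drops rows with no
        // order id and normalizes the status column.
        val clean = source
          .filter(col("order_id").isNotNull)
          .withColumn("status", upper(trim(col("status"))))

        // Load: write a Delta table for downstream dimensional models.
        clean.write
          .format("delta")
          .mode("overwrite")
          .save("/mnt/target/orders_clean")
      }
    }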

a) Drive development and delivery of key business initiatives for the Big Data Platform in collaboration with other stakeholders.

b) Collaborate with business stakeholders to gather business requirements.

c) Perform POCs on the Big Data Platform to determine the optimum solution.

d) Work with vendors to evaluate Big Data technologies and resolve technical issues.

e) Effectively collaborate with remote teams (on-shore and off-shore) on solution delivery.

Company Information