Required Skills

Python, AWS, PySpark

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 3rd Apr 2024

JOB DETAIL

  • Looking for a data expert with advisory, consulting, GenAI, AI/ML, Databricks, Python, and PySpark experience: a senior data engineering leader with industry knowledge and a strong understanding of GenAI.
  • The job involves advisory and consulting work and thought leadership: interacting with customers through workshops to understand pain points and business challenges, and providing data, analytics, and visualization solutions.
  • Must have experience and proficiency in all aspects of data management, data & analytics solution architecture and design, and implementation roadmaps. Must be hands-on in solutioning.
  • A large part of the work will be in data management and governance.
  • The candidate is also expected to have a good understanding of all kinds of analytics: descriptive, diagnostic, predictive, prescriptive, and cognitive (AI/ML as well as GenAI).
  • Must be experienced in either the AWS or the Azure data, analytics, and visualization stack; a candidate experienced in AWS is also expected to be proficient in Azure, and vice versa.
  • Architect and design a unified data & analytics solution on AWS/Azure and Snowflake/Databricks; map tools, technologies, and solution components against business capability requirements.
  • Analyze, understand, and capture business capability requirements
  • Analyze, understand, capture, and document the capabilities already available internally in one form or another, in order to create a data fabric architecture.
  • Analyze and understand the lines of business (LoBs), business systems, processes, teams and their roles and responsibilities, and the data and analytical products they create, in order to design a data mesh architecture that combines those assets into differing data products relative to business context.
  • Define the various data sharing and consumption patterns and the implementation methodology, using REST APIs, analytics workspaces, self-service, file shares, etc.
  • Define the framework, guiding principles, best practices, and reusable components for data at rest, data in motion, error control, process control, audit, reconciliation, orchestration, and so on.
  • Define testing strategy and reusable components
  • Create the implementation roadmap, conduct periodic reviews to ensure the team follows the architecture and principles, and hand-hold and guide the team.
  • Must be proficient in all aspects of the asset management and asset servicing business domains.
  • Must have implemented at least two or three data management solutions, end to end, in investment management, portfolio management, or fund management.
  • Mentor and guide the team on data management solutions and their implementation.
  • The candidate is expected to manage the implementation of the solution, following agile methodologies.
  • Skill set: full AWS / Azure data & analytics stack, along with Snowflake / Databricks Delta Lake.

Company Information