Required Skills

Data Analytics

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 23rd Jan 2024

JOB DETAIL

Major Responsibilities:

  • Work on finance data related to collaterals, ETD, OTD, settlement markets, cash products, repo, and duos repo
  • Design, develop, and deploy Databricks jobs to process and analyze large volumes of data.
  • Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
  • Optimize Databricks jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot Databricks jobs; identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Experience designing and developing Enterprise Data Warehouse solutions.
  • Demonstrated proficiency with data analytics and data insights.
  • Proficient in writing SQL queries and programming, including stored procedures and reverse-engineering existing processes.
  • Azure Synapse/BigQuery/Redshift experience is good to have.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
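The Databricks work described above typically reduces to a read–transform–write pipeline. As a minimal sketch of that transform step (hypothetical column names; a production Databricks job would use PySpark rather than pandas to handle the data volumes mentioned):

```python
import pandas as pd

def summarize_settlements(trades: pd.DataFrame) -> pd.DataFrame:
    """Aggregate notional by product for settled trades -- a stand-in for
    the kind of transform a Databricks job would run at scale in PySpark.
    Column names (product, notional, settled) are illustrative only."""
    settled = trades[trades["settled"]]  # keep settled trades only
    return (
        settled.groupby("product", as_index=False)["notional"]
        .sum()
        .sort_values("notional", ascending=False)
    )

# Tiny illustrative input
trades = pd.DataFrame({
    "product": ["repo", "repo", "ETD", "cash"],
    "notional": [100.0, 50.0, 75.0, 20.0],
    "settled": [True, True, True, False],
})
print(summarize_settlements(trades))
```

The same groupby/aggregate shape carries over almost verbatim to a PySpark DataFrame on Databricks.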

 

Skills:

  • 5+ years of strong experience in the finance/banking industry (capital markets, investment banking): collaterals, ETD, OTD, settlement markets, cash products, repo, duos repo
  • 10+ years of enterprise data management
  • 10+ years of SQL Server-based development on large datasets
  • 5+ years with data warehouse architecture; hands-on experience with the Databricks platform and extensive PySpark coding experience
  • Azure Synapse/BigQuery/Redshift experience is good to have
  • 3+ years of Python (NumPy, pandas) coding experience
  • Experience with Snowflake utilities such as SnowSQL and Snowpipe is good to have
  • Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling
  • Previous experience leading an enterprise-wide Cloud Data Platform migration with strong architectural and design skills
  • Capable of discussing enterprise level services independent of technology stack
  • Experience with Cloud based data architectures, messaging, and analytics
  • Superior communication skills
  • Cloud certification(s)
  • Any experience with regulatory reporting is a plus
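The data-warehousing bullet (OLTP/OLAP, dimensions, facts, data modeling) comes down to joining fact rows to dimension tables and rolling them up. A minimal sketch with hypothetical star-schema tables (all names are illustrative, not from the posting):

```python
import pandas as pd

# Hypothetical star-schema fragments: a fact table of settlement amounts
# keyed to a product dimension.
fact_settlement = pd.DataFrame({
    "product_key": [1, 1, 2, 3],
    "amount": [100.0, 50.0, 75.0, 20.0],
})
dim_product = pd.DataFrame({
    "product_key": [1, 2, 3],
    "product_name": ["repo", "ETD", "cash"],
})

# OLAP-style rollup: join fact to dimension, then aggregate by attribute
rollup = (
    fact_settlement.merge(dim_product, on="product_key")
    .groupby("product_name", as_index=False)["amount"]
    .sum()
)
print(rollup)
```

In SQL this would be a fact-to-dimension JOIN with GROUP BY; the pandas form is shown here only for a self-contained illustration.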

Education

  • At minimum, a bachelor's degree in an engineering and/or computer science discipline
  • Master’s degree strongly preferred

 

Company Information