Required Skills

Data Scientist

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 28th Nov 2025

JOB DETAIL


About The Team

Data pipeline and ETL tooling are some of the most important components required to make ML successful. Unifying these tools lets us simplify our development and invest all effort and enhancements into a single platform. Our ML tooling must enable ML applications to be easily built, maintained, deployed, and validated. By platformizing our ML pipeline tools, Workday can build many high-value ML solutions very quickly. Today, ML productivity can play a huge role in product success, and this investment into improving data pipeline and ETL tooling is exactly what our team does.

We strive to provide industry-leading solutions to ML and data engineering problems for Workday developers and Workday customers.  We are made up of an eclectic group of dedicated individuals who pull experience from almost every corner of Workday and some of today’s top tech companies.  We take great pride in being relentlessly innovative, intensely encouraging, and persistently positive.

As a Software Development Engineer in ML, you will:

  • Transition Databricks pipelines to Workday Data Engineering tooling on AWS
  • Transition Databricks notebooks to AWS SageMaker Studio
  • Develop features for Workday Data Engineering tooling that promote data mesh architecture principles
  • Lead teams through best practices in ML and MLOps
  • Influence the direction of our product vision and strategy with technical expertise and context
  • Provide critical feedback for the team’s technical designs, architecture, and decisions

About You

Basic Qualifications:

  • 2 or more years of experience using Databricks
  • 2 or more years of experience using ML pipeline tools: Airflow, Dagster, MLflow, AWS Glue, Spark, etc.
  • 2 or more years of experience using cloud compute technologies: AWS, GCP, etc.
  • 5 or more years of experience building production-grade software and practicing Agile methodologies

Other Qualifications:

  • 3 or more years of experience building data mesh architectures
  • 3 or more years of experience with MLOps
  • 3 or more years of experience with machine learning and data science technologies

Company Information