Required Skills

ETL

Work Authorization

  • US Citizen

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 12th May 2025

JOB DETAIL

  • Need someone with 9-10 years of experience.
  • Strong hands-on coding skills in Python.
  • Extensive hands-on experience with Databricks for developing integration-layer solutions.
  • AWS Data Engineer or Machine Learning certification or equivalent hands-on experience with AWS Cloud services.
  • Proficiency in building data frameworks on AWS, including hands-on experience with tools like AWS Lambda, AWS Glue, AWS SageMaker, and AWS Redshift.
  • Hands-on experience with cloud-based data warehousing and transformation tools such as Delta Lake Tables, DBT, and Fivetran.
  • Familiarity with machine learning and open-source machine learning ecosystems.
  • Hands-on experience with integration tools and frameworks such as Apache Camel and MuleSoft.
  • Solid understanding of API design principles, RESTful services, and message queuing technologies.
  • Familiarity with database systems and SQL.
  • Hands-on experience with Infrastructure as Code (IaC) tools like Terraform and AWS CloudFormation.
  • Proficiency in setting up and managing Databricks workspaces, including VPC management, security groups, and VPC peering.
  • Hands-on experience with CI/CD pipeline management using tools like AWS CodePipeline, Jenkins, or GitHub Actions.
  • Knowledge of monitoring and logging tools such as Amazon CloudWatch, Datadog, or Prometheus.
  • Hands-on experience with data ingestion and ETL processes using AWS Glue, Databricks Auto Loader, and Informatica.

Company Information