Required Skills

AWS, Spark, Scala, Databricks

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Jan 2024

JOB DETAIL

•     Seeking a highly skilled Data Engineer with 10 years of experience. Must have expertise in Spark and Scala.

•     5+ years of IT experience with AWS Cloud, Databricks, AWS S3, Spark/Scala, and Big Data tools and technologies

•     Build data mapping documents from the provided requirements before coding

•     Excellent knowledge of writing SQL queries for any given problem statement

•     Programming knowledge in AWS Cloud, AWS Glue ETL, and Scala, Java, or Python (preferably Scala)

•     Good understanding of Spark concepts (RDDs/DataFrames/Datasets/Streaming); experience coding in Scala is preferred (a brief illustration follows this list)

•     Good understanding of AWS cloud terminology and a solid grasp of core AWS services (e.g. S3, IAM)
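
For illustration only, a minimal Spark/Scala sketch of the skills listed above: reading a dataset from S3 into a DataFrame and answering a simple problem statement with a SQL query. The bucket, path, and column names are hypothetical assumptions, not part of the role description.

    import org.apache.spark.sql.SparkSession

    object SkillsSketch {
      def main(args: Array[String]): Unit = {
        // Entry point for a Spark job (e.g. run on Databricks)
        val spark = SparkSession.builder().appName("skills-sketch").getOrCreate()

        // Read a hypothetical dataset from S3 into a DataFrame
        val orders = spark.read
          .option("header", "true")
          .csv("s3a://example-bucket/orders/") // hypothetical location

        // Express a simple problem statement as a SQL query
        orders.createOrReplaceTempView("orders")
        val totals = spark.sql(
          """SELECT customer_id, SUM(amount) AS total_amount
            |FROM orders
            |GROUP BY customer_id""".stripMargin)

        totals.show()
        spark.stop()
      }
    }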

 

Roles & Responsibilities:

•     Perform data analysis to understand the existing business requirements

•     Design and build scalable data pipelines using the Big Data stack: Databricks, Spark/Scala, and AWS (Athena, S3, Glue ETL, Redshift, DynamoDB); see the sketch after this list

•     Design ingress and egress patterns for the data lake to support self-service use cases and improve time to business value.

•     Implement CI/CD data pipelines using GitHub, Jenkins, and Artifactory to automate the code build and promotion process.

•     Engage with customers and principal architects to understand business requirements and build the AWS cloud big data platform and solutions.

•     Create proofs of concept, evaluate new products, and develop a roadmap for the NextGen Technology and Digital Cloud First strategy

•     Job orchestration

•     Ability to work independently and guide new team members end to end, from development through unit testing, UAT, and production deployment/cutover, ensuring quality

•     Any experience with AWS Cloud, AWS Glue ETL, Databricks Delta, Redshift, or Talend Big Data would be an added advantage

•     Good written and oral communication skills

•     Ability to work independently within an agile Scrum methodology.
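
For illustration only, a minimal sketch of one ingress-to-curated step of the kind of S3-based data-lake pipeline described in the responsibilities above. The lake paths and column names are hypothetical assumptions.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, to_date}

    object PipelineStepSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("pipeline-step-sketch").getOrCreate()

        // Ingress: read raw JSON events from the ingest prefix
        val raw = spark.read.json("s3a://example-lake/ingress/events/") // hypothetical

        // Light transformation: type the event date and drop malformed rows
        val curated = raw
          .withColumn("event_date", to_date(col("event_ts")))
          .filter(col("event_id").isNotNull)

        // Egress: write partitioned Parquet for self-service consumption
        curated.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3a://example-lake/curated/events/") // hypothetical

        spark.stop()
      }
    }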

 

Company Information