Required Skills

PySpark, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 22nd Aug 2022

Job Description
•    8+ years implementing data pipelines or data-intensive assets using Python, Java, or Scala
•    5+ years using distributed data processing engines such as Apache Spark or Hive
•    2+ years creating modular data transformations using an orchestration engine such as Apache Airflow or an equivalent such as Apache NiFi
•    4+ years building cloud-native solutions in AWS (especially S3, Glue, Lambda, Step Functions, EMR, and EC2) or Azure
•    Experience creating re-distributable and portable data assets using containers or cloud-native services
•    Hands-on experience building decision support systems or advanced analytics solutions
•    Able to architect and own the execution of an end-to-end technical workstream
•    Experience designing and implementing REST APIs is a plus
•    Expert understanding of Continuous Delivery practices
•    Experience delivering solutions through an Agile delivery methodology
•    Ability to understand complex systems and solve challenging analytical problems
•    Comfort with ambiguity and rapid changes common in early-stage product development
 
Responsibilities
•    Healthcare domain knowledge is not required, but interest in healthcare is appreciated.
•    You will deliver software products with high levels of value, usability, quality, and predictability for our engagements, and help our clients transform social, healthcare, and public entities across the world.
•    In this role, you will work with product managers and other engineers and be a key contributor to delivering incremental solutions in an Agile manner.
•    You will help design, develop, test, and maintain software solutions that are leveraged within the SHaPE Analytics group. You will help build out data frameworks that will allow our analytics to run at scale.
•    You will look for opportunities to continuously improve the solutions and help in the resolution of issues and defects that may occur within the environment.
•    You will be an active contributor to improving the code quality and test automation of our products, leaning on current automated test frameworks and best practices.
•    To be successful, you will be an active learner and contributor to the team; your voice and opinions are expected to be expressed and heard.
•    We have an environment that thrives on the fact that varied experiences and skill sets create better solutions.
•    Lastly, you will contribute code and participate in code reviews; you will break down user stories into technical tasks and requirements; and you will identify, communicate, and escalate risks when appropriate.

Company Information