Required Skills

Python Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 14th Jun 2022

JOB DETAIL

Top Skills

1) AWS EMR (architecture and configuration)

2) Python and PySpark

3) Building Microservices

4) API tools: Apigee, AWS API Gateway, API security, API frameworks, etc.

5) SQL/Data

 

Will be guiding, mentoring, and working much like a Tech Lead.

**** SAS programming is not required, but the candidate must be able to run SAS code in order to get data and results to compare once migrated to the AWS cloud.

**** Desktop as a Service

The Credit Expense Forecast team is looking for a PySpark developer on EMR/AWS to migrate SAS models into AWS.
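For context on the validation step above, here is a minimal PySpark sketch of the kind of comparison the role involves; the S3 paths, column names (segment_id, period, expected_loss), and tolerance are all hypothetical and stand in for the client's actual datasets. It reads the results exported from a legacy SAS run and the output of the migrated EMR/PySpark job, then flags rows where the forecasts differ.

# Minimal comparison sketch; all paths, schemas, and the tolerance are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-vs-pyspark-check").getOrCreate()

# Results exported from the legacy SAS run (hypothetical path and schema).
sas_results = spark.read.csv("s3://bucket/sas_export/forecast.csv",
                             header=True, inferSchema=True)

# Results produced by the migrated PySpark job on EMR (hypothetical path).
pyspark_results = spark.read.parquet("s3://bucket/emr_output/forecast/")

# Join on the key columns and flag rows where the forecasted loss differs
# by more than a small tolerance.
tolerance = 1e-6
diff = (sas_results.alias("sas")
        .join(pyspark_results.alias("py"), on=["segment_id", "period"])
        .withColumn("abs_diff",
                    F.abs(F.col("sas.expected_loss") - F.col("py.expected_loss")))
        .filter(F.col("abs_diff") > tolerance))

print(f"Rows outside tolerance: {diff.count()}")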

 

Description and Skill Set:

1. Work with business and technical leads on the transformation of critical Credit Risk Forecasting applications to a microservices AWS EMR architecture leveraging PySpark.

2. Provide guidance on application architecture, design, and implementation of the credit loss and home price forecasting process for a Fortune 100 client.

3. Work with, lead, and mentor the business and technology teams. Build a solution based on cloud microservices and APIs. Decompose legacy monolithic systems into microservices and APIs running in the cloud.

4. The lead developer is expected to be hands-on with converting SAS code to PySpark, ensuring quality and maintainability.

5. Well versed in EMR PySpark performance tuning, leveraging the EMR architecture (master node, cluster, memory utilization, etc.).

6. Understanding of AWS computing and data architecture to create a sound, high-performing implementation of the Credit Expense Forecasting process.

7. Experience with parallel programming with big data, ETL, and forecasting models.

8. Experience designing an API layer using tools such as Apigee, AWS API Gateway, API security, API frameworks, etc.

9. At least 5 years of experience working in a PySpark/EMR environment.

10. Proficiency with one or more programming languages (Java, Python; PySpark preferred).

11. Proficiency in SQL, relational and non-relational databases, query optimization, and data modeling.

12. Experience working in a Linux-based environment.

13. Experience working on teams operating under an Agile Scrum delivery methodology.

 

Company Information