- US Citizen
- Green Card
- EAD (OPT/CPT/GC/H4)
- H1B Work Permit
- Corp-Corp
- W2-Permanent
- W2-Contract
- Contract to Hire
- UG: Not Required
- PG: Not Required
- No. of positions: 1
- Posted: 6th Sep 2024
Responsibilities:
- Develop and execute POCs to demonstrate the capabilities of the existing internal tool suite.
- Analyze and compare the potential of continuing with the current tools versus investing in Databricks, providing well-informed recommendations.
- Lead the transformation of Redshift/Stored Procedure data applications, migrating them to a new Glue/Iceberg framework as part of this comparison.
- Explore the possibilities of using MS Fabric for data visualization and integration, identifying any added value it could bring.
- Document findings, methodologies, and recommendations clearly and comprehensively.
Qualifications:
- Databricks: Proficiency in using Databricks for data engineering, analytics, and machine learning is a plus.
- DevOps: Proficiency with GitHub Actions, AWS CDK, and CI/CD pipeline development.
- SQL/PL-SQL: Strong skills in reading, understanding, and working with SQL and PL/SQL code.
- Python Development: Experience in Python programming, particularly in the context of data processing.
- AWS Development: Solid experience with AWS services, especially in data-centric projects.
- PySpark/Glue API: Hands-on experience with PySpark and/or AWS Glue API development.
- Documentation/Communication: Excellent ability to document processes and communicate complex concepts effectively.
- MS Fabric: Experience with MS Fabric for enhancing data visualization and integration.
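To illustrate the kind of refactoring the role involves: migrating stored-procedure logic off Redshift typically means re-expressing row-by-row procedural updates as single set-based transforms before porting them to a Glue/PySpark job. The sketch below is a minimal, hypothetical illustration of that pattern, using Python's standard-library sqlite3 as a stand-in database (the table, columns, and tier thresholds are invented for the example and are not from the posting):

```python
import sqlite3

# Toy stand-in for a Redshift table (hypothetical schema, illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, tier TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 50.0, None), (2, 500.0, None), (3, 5000.0, None)],
)

# A stored procedure often loops over rows, updating one at a time.
# The migration target is one declarative, set-based statement, which
# maps directly onto a DataFrame transform in a Glue/PySpark rewrite.
conn.execute("""
    UPDATE orders
    SET tier = CASE
        WHEN amount >= 1000 THEN 'gold'
        WHEN amount >= 100  THEN 'silver'
        ELSE 'bronze'
    END
""")

tiers = dict(conn.execute("SELECT id, tier FROM orders"))
print(tiers)  # {1: 'bronze', 2: 'silver', 3: 'gold'}
```

At Glue scale, the same CASE expression would become a `withColumn`/`when` chain over a DataFrame, but the refactoring step, procedural loop to set-based transform, is identical.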