UG :- Not Required

PG :- Not Required

No. of positions :- 1

Posted :- 13th May 2022
- Hands-on working experience with Spark, Python, AWS, S3, Snowflake, Unix shell scripting and Hadoop
- Develop and execute tests; collect and analyse data; identify and fix defects
- Understand the SDLC and create the relevant documents (e.g. peer review and deployment documents) as per customer expectations
- Track and report impediments
- Expertise in data analysis
- Proficient in AWS, S3, the Hadoop ecosystem, Hive, Unix shell scripting, Python and Airflow
- Exposure to data warehousing concepts and SQL
- Exposure to Delta Lake will be an advantage
- Agile (Scrum) experience in live projects
- Good knowledge of cloud technologies from the perspective of analytical solutions
- Good knowledge of AWS storage