Required Qualifications:
- At least 3-5 years of experience with the Software Development Life Cycle (SDLC)
- At least 3-5 years of experience working on a big data platform (Hadoop, Spark, Impala, NiFi, Kafka, ZooKeeper, and Scala)
- At least 2-3 years of experience with Apache Spark
- At least 3 years of experience working with unstructured datasets
- At least 1 year of experience building data pipelines, CI/CD pipelines, and fit-for-purpose data stores
- At least 1 year of Agile experience
- Good understanding of data lake fundamentals
- Technical proficiency with data integration and data lake design patterns
- Strong communication, collaboration, and multi-tasking abilities
Preferred Qualifications:
- Healthcare experience
- Flexibility to work with an offshore team