JD for Big Data Engineer
We are looking for a Developer/Senior Developer to be part of building an advanced analytical platform that leverages Big Data technologies and transforms legacy systems. The role offers an exciting, fast-paced, constantly changing, and challenging work environment, and will play an important part in resolving and influencing high-level decisions across the Standard Chartered Retail Banking Foundation program.
Requirements:
- The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
- A minimum of 3 to 9 years of overall software development experience, including at least 2 years of Data Warehousing domain experience
- At least 3 years of hands-on working knowledge of Big Data technologies such as Hive, Hadoop, HBase, Spark, NiFi, Scala, and Kafka
- Excellent knowledge of SQL and Linux shell scripting
- Bachelor's/Master's/Engineering degree from a well-reputed university.
- Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
- Proven experience coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
- Ability to manage a diverse and challenging stakeholder community
- Broad knowledge of and experience working in Agile deliveries and Scrum teams.
Responsibilities:
- Responsible for the documentation, design, development, and architecture of Hadoop applications
- Convert complex technical and functional requirements into detailed designs
- Work as a senior developer or individual contributor depending on the situation
- Work to Scrum timelines as a Big Data developer and deliver accordingly
- Prepare Unit/SIT/UAT test cases and log the results
- Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations on time
- Drive small projects individually.
- Coordinate changes and deployments on time
If interested, please share your updated CV at Umeraai@maveric-systems.com