We are looking for a Developer/Senior Developer to help build an advanced analytical platform leveraging Big Data technologies and to transform legacy systems. This role offers an exciting, fast-paced, constantly changing and challenging work environment, and will play an important part in resolving and influencing high-level decisions across the Standard Chartered Retail Banking Foundation program.
Requirements
- The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
- Overall 5 to 12 years of software development experience, including at least 2 years of data warehousing domain knowledge
- Must have at least 3 years of hands-on working knowledge of Big Data technologies such as PySpark, Hive, Hadoop, HBase, Spark, NiFi, Scala, and Kafka
- Excellent knowledge of SQL and Linux shell scripting
- Strong communication, interpersonal, learning, and organizational skills, matched with the ability to manage stress, time, and people effectively
- Proven experience coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
- Ability to manage a diverse and challenging stakeholder community
- Broad knowledge and experience of working on Agile deliveries and in Scrum teams
Responsibilities
- Responsible for the documentation, design, development, and architecture of Hadoop applications
- Convert complex technical and functional requirements into detailed designs
- Adhere to Scrum timelines and deliver accordingly
- Prepare unit, SIT, and UAT test cases and log the results
- Coordinate SIT and UAT testing; gather feedback and provide necessary remediation/recommendations on time
- Drive small projects independently
- Coordinate changes and deployments on time