Responsibilities:
Designing and developing strategic technology solutions on Hadoop based on business requirements
Collaborating across the organization on current- and future-state architecture
Hands-on development within a global development team
Monitoring platform processes and activities across analytics teams to ensure platform stability
Coordinating with our DevOps teams, infrastructure engineers, and production support to triage and resolve issues and to manage resource capacity of our Big Data platform efficiently
Skills Required:
7-10 years of relevant technical experience
Proficiency in developing software in Scala and Python
Experience using Spark (with Scala) for data processing
Experience in the financial services domain
Experience maintaining Cloudera Hadoop infrastructure such as HDFS, YARN, Spark, Impala and edge nodes
Strong grasp of high-quality software architecture and design methodologies and patterns
Strong SQL skills and commensurate experience with a large database platform
Familiarity with the complete SDLC and Agile methodology (Scrum)
Strong oral and written communication skills
Experience with Core Java
Experience with XML/JSON technologies
Experience with Data Federation or Virtualization technologies
Experience with developing Cloud-based Big Data solutions
Experience with tools and platforms such as Dataiku and Elasticsearch
Experience developing other application types (web applications, batch, or streaming)