Deploy (Dev, UAT, and Prod), manage, and monitor data platforms and data-movement solutions
Build scalability, fault-tolerance, security, and performance into our data platforms to meet the growing needs of our customers
Partner with agile data development teams and apply DevOps methodologies to create automated, efficient CI/CD processes that reduce the time to promote, test, and deploy code using GitLab
Partner with Information Security professionals to ensure data is secure both at rest and in flight
Participate in support rotations to help respond to infrastructure issues
Requirements:
Document and maintain key architecture and coding standards for supported platforms
Solid understanding of Infrastructure as Code, Linux, Docker and Clusters
Experience with Hadoop management and security tools such as Ambari and Ranger
Previous experience operating one or more of the following tools: NiFi, Hive, Spark, Sqoop, and YARN
Ability to work autonomously with minimal supervision
Nice to have:
Understanding of the native Hadoop ecosystem (on-premises rather than cloud-based solutions)
Experience with Shell and Python programming languages
Understanding of best practices for alerting and monitoring using tools such as Ambari and Ranger