- The candidate will be responsible for designing, building, and maintaining security-enabled Big Data workflows/pipelines that process billions of records into and out of our Hadoop Distributed File System (HDFS).
- The candidate will engage in sprint requirements and design discussions.
- The candidate will be proactive in troubleshooting and resolving data processing issues.
- The candidate should be a highly accountable self-starter with a strong sense of urgency who can work autonomously with limited direction.
Skills Requirements:
FOUNDATION FOR SUCCESS (Basic Qualifications)
- Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field.
- A Master's or Doctorate degree may substitute for required experience.
- 7+ years of experience with Hadoop, Big Data, Kafka, and data analysis.
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).