Job responsibilities:
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems.
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
- Adds to team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Hands-on practical experience delivering system design, application development, testing, and operational stability.
- Proficient in one or more programming language(s).
- Proficiency in automation and continuous delivery methods.
- Proficient in all aspects of the Software Development Life Cycle.
- Understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).
- Practical cloud native experience.
- Ability to develop reports, dashboards, and processes to continuously monitor data quality and integrity.
- Working knowledge of Bitbucket and JIRA.
Preferred qualifications, capabilities, and skills:
- Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, and EMR.
- Strong experience with distributed computing frameworks such as Apache Spark, particularly PySpark.
- Strong hands-on experience building event-driven architectures using Kafka.
- Experience writing Splunk or CloudWatch queries and working with Datadog metrics.