- Experience as an enterprise Data Engineer with a consulting background
- Proven experience building, operating, and maintaining fault-tolerant, scalable data processing integrations on AWS
- 3+ years of experience with the Python programming language
- Software development experience with Apache Airflow, Spark, MongoDB, and MySQL
- Strong capacity to manage multiple concurrent projects is a must
- Experience using Docker or Kubernetes is a plus
- BS/MS degree in Computer Science or equivalent industry experience
- Ability to identify and resolve problems in production-grade, large-scale data processing workflows
- Excellent communication skills (we’re a geographically distributed team)
- Experience creating and maintaining unit tests and continuous integration
- Passion for creating intelligent data pipelines that customers love to use
Special consideration given for:
- Experience with and knowledge of Web Analytics or Digital Marketing
- Experience with and knowledge of Customer Data Platform (CDP) or Data Management Platform (DMP) solutions
- Experience with and knowledge of Client Experience Cloud solutions