This is a partnership-heavy role. As a member of the Data team, you will enable functions across the company, e.g. Product, Engineering, and Go-to-Market, to be data-driven. As a Data Engineer, you will take on big data challenges in an agile way. You will build data pipelines that make data accessible to the entire company, enabling data scientists, operations teams, and executives to act on it. You will also build data models to deliver insightful analytics while ensuring the highest standards of data integrity. You are encouraged to think outside the box and experiment with the latest technologies while exploring their limits. Successful candidates will have strong technical capabilities and a can-do attitude, and will be highly collaborative.
Job Responsibilities:
	- Design, build, and launch efficient, reliable data pipelines to move data across a number of platforms, including the data warehouse and real-time systems.
	- Develop strong subject-matter expertise and manage the SLAs for those data pipelines.
	- Set up and improve BI tooling and platforms to help the team create dynamic tools and reporting.
	- Partner with data scientists and business partners to develop internal data products that improve operational efficiency across the organization.
	- Here are some examples of our work:
	- Data Pipelines - Create new pipelines or rewrite existing ones using SQL and Python on Airflow and DBT
	- Data Quality and Anomaly Detection - Improve existing tools to detect anomalies in real time and through offline metrics
	- Data Modeling - Partner with analytics consumers to improve existing datasets and build new ones
Job Qualifications:
	- 4 to 6 years of experience in a Data Engineering role, with a focus on data warehouse technologies, data pipelines and BI tooling.
	- Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Engineering, or a related technical discipline
	- Expert knowledge of SQL and of relational cloud database systems and concepts.
	- Strong knowledge of data architectures, data modeling, and the data infrastructure ecosystem.
	- Experience with enterprise business systems such as Salesforce, Marketo, Zendesk, Clari, Anaplan, etc.
	- Experience with ETL pipeline tools such as Airflow and DBT, and with version control systems such as Git.
	- The ability to communicate cross-functionally, derive requirements, and architect shared datasets; the ability to synthesize, simplify, and explain complex problems to different audiences, including executives.
	- The ability to thrive in a dynamic environment. That means being flexible and willing to jump in and do whatever it takes to be successful.
Nice-to-haves:
	- Experience with Apache Kafka
	- Knowledge of batch and streaming data architectures
	- A product mindset to understand business needs and come up with scalable engineering solutions