Required Skills

Data Architect, Data Warehouse Architect

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 24th Sep 2022

JOB DETAIL

Data Architect Job Description

As part of the company’s growing Business Intelligence team, we’re looking for someone to design, develop and implement technology and data solutions that meet current and future data warehousing and reporting needs. We need an experienced engineer who can set the technical direction, establish best practices, and partner with all consumers of data to build and support the company’s data warehouse and ETL platform.

Responsibilities for DW Architect

  • Design and implement relational and dimensional models in Snowflake and related tools.
  • Design, build and support stable, scalable data pipelines or ETL processes that cleanse, structure and integrate big data sets from multiple data sources into DW and provision to transactional systems and Business Intelligence reporting.
  • Collaborate with product and other engineering teams on normalizing and aggregating large data sets based on business needs and requirements.
  • Work with different databases (Snowflake, Postgres, MySQL, NoSQL stores).
  • Work with different data formats (JSON, CSV, Parquet, Avro) and interact with various on-premise and cloud data sources as well as RESTful APIs.
  • Establish, maintain and administer DW clusters and other data infrastructure.
  • Implement systems for tracking data quality and consistency.
  • Coach and mentor team members and promote team development and growth.
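To make the "cleanse, structure and integrate" responsibility above concrete, here is a minimal extract-cleanse-load sketch using only the Python standard library. The column names, cleansing rules, and inline CSV data are invented for illustration; a real pipeline would read from the source systems and write to the warehouse rather than printing JSON.

```python
import csv
import io
import json

# Hypothetical raw extract: note the messy casing/whitespace and a missing email.
raw_csv = io.StringIO("id,email\n1, Alice@Example.COM \n2,\n3,bob@example.com\n")

cleansed = []
for row in csv.DictReader(raw_csv):
    email = row["email"].strip().lower()  # normalize the value
    if not email:                         # drop rows failing a basic quality check
        continue
    cleansed.append({"id": int(row["id"]), "email": email})

# "Load" step: here we just serialize; a warehouse pipeline would bulk-insert instead.
print(json.dumps(cleansed))
```

The same shape (extract, per-row cleanse, structured load) generalizes to the JSON, Parquet and Avro sources the role mentions, typically via an orchestrator such as Airflow.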

Qualifications for DW Architect

  • 5+ years of experience in a data warehousing or data engineering role
  • Must have 2+ years of experience with modern data warehouse systems
  • Experience designing and implementing data warehouse architecture
  • Very strong SQL skills and experience
  • Knowledge and expertise in system performance and optimization, schema design, and capacity planning
  • Proven ability to design, develop, and maintain data warehousing and ETL workflows for large data sets and interact with various sources
  • Knowledge of industry-standard ETL and workflow tools (Airflow or other third-party tools) is preferred, as is experience writing your own in Python
  • Working experience with Azure or Amazon Web Services (S3, EC2, Lambda, Data Pipeline)
  • Hands-on experience and expertise with advanced Snowflake/MySQL/Postgres SQL features, specifically analytical (window) functions
  • Strong programming and scripting skills (Python experience is a plus)
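As a quick illustration of the analytical (window) functions called out above, here is a minimal sketch using Python's bundled SQLite driver, which supports window functions in SQLite 3.25+. The `orders` table and its data are invented for the example; the same `RANK() OVER` / `SUM() OVER` syntax carries over to Snowflake, Postgres and MySQL 8+.

```python
import sqlite3

# In-memory database with a small illustrative table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100), ("east", 300), ("west", 200), ("west", 50)],
)

# Window functions rank rows within each region and attach a per-region total
# without collapsing the detail rows, unlike a GROUP BY aggregate.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM orders
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
```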

Even better if you have:

  • Bachelor’s degree or equivalent in Computer Science or Engineering preferred
  • Experience developing microservices in Flask or Django is highly preferred
  • Data Science or Machine Learning experience and skills are highly preferred

Company Information