Following are some of the responsibilities an enterprise data modeler is expected to assume. At a high level, this is not the traditional relational data modeler role; it is someone with prior strength in the API and ontology areas (e.g., an ontologist).
- Understand the enterprise data needs of the company or client, encompassing API schema design, business event design and choreography, as well as the data platform.
- Create and maintain taxonomies and the relevant metadata for each taxonomy.
- Ensure ontologies provide comprehensive domain coverage and are available for both human and machine ingestion and inference.
- Evangelize ontology and semantic technologies within the organization.
- Work closely with product owners and system architects to understand the business needs for information content, both operational and analytical.
- Oversee and execute conceptual and logical modeling in support of system design, API design, and event design.
- Articulately narrate conceptual models and elicit conceptual language from the business.
- Coordinate with and consult design teams on API and event schema design, as well as on event generation and choreography.
- Consult design teams on planning schema evolution in a coordinated manner.
- Help drive the consensus across regions (e.g. Europe and North America) on harmonizing regional models.
- Learn and fully understand JSON Schema and YAML semantics.
- Learn and fully understand data-synchronization and data-movement patterns in the API and business-event ecosystem.
- Work with regional teams to reason out and enforce enterprise logical models.
- Work across existing enterprise models to determine whether new additions should reuse existing elements, and assess their impact.
- Learn to author YAML-based logical models as software products in a DevOps CI/CD pipeline (Maven, GitLab, Jenkins, Artifactory).
- Write SQL to profile the data in the data warehouse as needed.
- Understand the context of the company’s migration from batch file integration to real-time integration via APIs and events.
- Understand the context of the company’s migration of data services needs from legacy to modern data platform.
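To illustrate the JSON Schema and YAML authoring mentioned in the responsibilities above, a YAML-based logical model entity might look roughly like the sketch below. The entity, attribute names, and `$id` URL are hypothetical, used only for illustration:

```yaml
# Hypothetical logical model entity, expressed as JSON Schema authored in YAML.
# All names and the $id URL are illustrative, not from an actual enterprise model.
$schema: "https://json-schema.org/draft/2020-12/schema"
$id: "https://example.com/schemas/customer.yaml"
title: Customer
type: object
required: [customerId, name]
properties:
  customerId:
    type: string
    description: Enterprise-wide unique identifier.
  name:
    type: string
  region:
    type: string
    enum: [EU, NA]   # candidate point of regional harmonization (Europe vs North America)
```

A model like this can be versioned and published through the CI/CD pipeline alongside other software artifacts.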
Job Qualifications and Skill Sets
Below are the qualifications expected of a data modeler:
- Understanding of REST and/or GraphQL API interface design, or prior experience working as an ontologist.
- Understanding of pub/sub messaging (e.g., Kafka) and message design.
- Expertise in building conceptual and logical data models (canonical models) for holistic enterprise data design.
- Self-driven; able to work independently with little training on the existing platform.
- Bachelor’s degree in computer science, data science, information technology, or data modelling; a master’s degree is a plus.
- Reliable consultant on physical data models in JSON, YAML, and SQL.
- Very particular about taxonomy and the naming of data model entities/attributes.
- Experience building enterprise data models for large event-driven ecosystems and designing optimal, best-practice data models.
- Familiarity with data modeling software such as SAP PowerDesigner, Microsoft Visio, or erwin Data Modeler
- Excellent presentation, communication, and organizational skills
- Strong attention to detail
- Ability to work in a fast-paced environment
- Ability to work both independently and as part of a team
- Ability to intelligently profile and tune data using SQL.
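The SQL data-profiling skill listed above can be sketched with a small example. The `customer` table and its columns are hypothetical, and `sqlite3` stands in for the warehouse engine purely for illustration:

```python
import sqlite3

# Hypothetical staging table standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (customer_id TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?)",
    [("c1", "EU"), ("c2", "NA"), ("c3", None), ("c3", "EU")],
)

# Typical profiling query: row count, null count, and distinct cardinality.
row = conn.execute(
    """
    SELECT COUNT(*)                    AS total_rows,
           SUM(region IS NULL)         AS null_regions,
           COUNT(DISTINCT customer_id) AS distinct_customers
    FROM customer
    """
).fetchone()

print(row)  # (4, 1, 3)
```

Queries of this shape surface data-quality issues (nulls, unexpected duplicates) before a model is finalized.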
Preferred:
- Knowledge of Domain-Driven Design methodology
- Knowledge of NoSQL modelling
- Some full stack experience
- Knowledge of GraphQL APIs, Apollo, and Apigee
- Knowledge of Azure Cloud, especially Azure Synapse