- Strong data analysis and data profiling skills
- Strong conceptual, logical, and physical data modeling skills for VLDB data warehouses and graph databases
- Hands-on experience with a modeling tool such as ERwin or another industry-standard tool
- Proficient in both normalized and dimensional model disciplines and techniques
- Minimum of 3 years’ experience in Oracle Database
- Hands-on experience with Oracle SQL, PL/SQL, or Cypher
- Exposure to Databricks Spark, Delta technologies, Informatica ETL, or other industry-leading tools
- Good knowledge of or experience with AWS Redshift and with graph database design and management
- Solid understanding of AWS cloud technologies, particularly VPC, EC2, S3, DMS, and Glue
- Bachelor’s degree in software engineering, computer science or information systems (or equivalent experience)
- Excellent verbal and written communication skills, including the ability to describe complex technical concepts in relatable terms
- Ability to lead multiple workstreams, with the confidence to decide how to prioritize work efforts
- Data-driven mentality; self-motivated, responsible, diligent, and detail-oriented
- Ability to learn and maintain knowledge of multiple application areas
- Ability to work independently and in conjunction with a team
- Understanding of industry-standard methodologies pertaining to Quality Assurance concepts and procedures
Responsibilities:
- Participate in requirements definition, analysis, and the design of logical and physical data models for dimensional, NoSQL, or graph data models
- Lead data discovery discussions with the business in JAD sessions and map business requirements to logical and physical data modeling solutions
- Conduct data model reviews with project team members
- Gather technical metadata through data modeling tools
- Ensure database designs efficiently support BI and end user requirements
- Drive continual improvement and enhancement of existing systems
- Collaborate with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation
- Collaborate with Data Architects on data model management, documentation, and version control
- Maintain expertise and proficiency in the various application areas
- Maintain current knowledge of industry trends and standards
Education/Experience Level:
- Bachelor’s degree in Computer Science, Engineering, or a relevant field, with 8+ years of experience as a Data and Solution Architect supporting enterprise data and integration applications, or a similar role, for large-scale enterprise solutions
- 8+ years of experience with data modeling in finance, commercial, supply chain, procurement, and customer analytics domains
- 8+ years of IT platform implementation experience in a technical role
- 8+ years of development and/or DBA experience in Relational Database Management Systems (RDBMS)
- 3+ years of experience with data transformation/ETL tools and technologies, and related concepts such as data catalogs
- 5+ years of Big Data infrastructure and tuning experience in a Lakehouse data ecosystem, including data lakes, data warehouses, and graph databases
- AWS Solutions Architect Professional-level certification
- Extensive experience in data analysis on critical enterprise systems such as SAP, E1, mainframe ERP, SFDC, the Adobe Platform, and eCommerce systems