Required Skills

DATA ARCHITECT

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-to-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :-

  • PG :- Not Required

Other Information

  • No. of positions :- 1

  • Posted :- 24th Sep 2021

JOB DETAIL

Provide technical leadership in Information and Data Architecture, working with Enterprise Data Architects and leveraging the TOGAF architecture methodology, with oversight of Domain Modeling, Logical Data Modeling, and Physical Data Model implementation.

Apply modern data management toolsets and coding methods to design, build, implement, and optimize data solutions of all types, including data warehouses, data lakes, ODS, streaming data, analytics, and BI/visualizations.

Translate business issues and needs into data and system requirements, and architect the management of data assets and their flow through the enterprise.

Architect and design Data Services (DaaS) for data consumption and manipulation throughout the Data Ecosystem, applying the “contract first” design principle and including the use of APIs, Microservices, Microbatch, ELT Pipelines, and other methods.

Transform legacy data structures and processes into modern, capable, and secure solutions in a hybrid cloud setup.

Apply Data Engineering & Design best practices to architect solutions, using a deep understanding of various data formats and database design approaches

Architect Data Storage solutions for OLTP (CRM, etc.) Systems, Analytics Platforms, Data Lakes, and Data Warehouses, using Relational Database and Object Storage methods tailored to best fit the needs.

Architect a best-practice data ingestion framework for batch and real-time data flows, and develop tooling to increase scale, accuracy, and automation in data pipelines and to integrate with decisioning, AI/NLP, and consuming systems.

Architect Data Catalogue and Metadata Management

Architect and Model for Master Data Management.

Architect for various Analytics Methods, including Descriptive, Diagnostic, Predictive, and Prescriptive, and Capabilities including Real-time Analytics, Advanced Analytics, and Machine Learning (ML/AI/NLP).

Work with Enterprise Architects and Information Security Architects to design a highly secure data platform ecosystem by designing controls and protection strategies.

Enable application performance and modernization by creating appropriate data capabilities to match.

Determine best-of-breed Tools and Technologies, leveraging CNCF-backed Open Source, Managed Solutions, and Engineered Solutions where applicable.

5+ years of experience in data analysis, engineering, architecture and operations roles, including experience with transformational efforts

Strong Database skills with RDBMS (e.g., Oracle, SQL Server) as well as modern relational and unstructured data sources (such as NoSQL), including cloud services (AWS/GCP/Azure). Hands-on experience using these tools is strongly preferred.

Experience with tools such as (or similar to) the Hadoop Stack, Airflow, Kafka, NiFi, PostgreSQL, Oracle, SQL Server, and Elasticsearch (ELK); JSON, Parquet, Avro, and other Data Storage formats; Tableau, Superset, and other Visualization Tools; Apache Atlas and other data-centric Apache packages.

Extensive Knowledge of Design Patterns for Software and Data Engineering.

Experience coding with Java, JavaScript (Node.js), Python, Go, Rust, and similar.

Experience in on-prem and hybrid cloud infrastructure, including service and cost optimization

Experience with production and analytics data, batch and real-time/streaming processing, etc.

Experience in regulated industries preferred (such as financial services, insurance, healthcare, etc.)

Familiarity with optimization tools and techniques, including Bayesian modelling and a variety of machine learning techniques.

Ability to manage large programs and projects is essential.

 

Company Information