Minimum 8+ years of experience architecting/solutioning in both Big Data and DW
Experience with Spark (preferred) / PySpark and other Big Data projects
Experience working on cloud implementations in the AWS ecosystem - S3, EC2, etc.
Good knowledge of the Snowflake data warehouse
Knowledge of BI reporting tools and frameworks is an advantage, especially Tableau
Knowledge of data ingestion and ETL tools and frameworks is an advantage, especially Talend, DBT, and Boomi
Should have sound knowledge of cloud implementations, preferably AWS
Should have knowledge of data ingestion (batch and real-time)
Design & recommend database object designs and cluster requirements
Should be proficient in producing technical documents such as UML diagrams, HLD, and LLD
Must have strong analytical, troubleshooting, and problem-solving skills
Must have good communication skills
Central publication of both source and refined data sets
Data Governance & Data Security throughout the environment
Responsibilities:
A Technical Architect defines and owns the technical architecture of systems to deliver business objectives while ensuring quality standards.
Responsible for high-level requirement gathering, consulting, design, development, and definition of the technical architecture as per business requirements.
Enable the creation of designs and frameworks for models and validate their adherence to the overall architecture.
Will be responsible for technically leading software projects through all stages of the life cycle, including requirements capture, design, development, and acceptance testing.
Lead and direct the technical team in building POCs to inform critical architectural decisions
Work with the client architects' team to complete the architecture
Work closely with the BA team to groom technical requirements and validate system requirements
Create the solution design & HLD for new applications
Review code and provide guidelines to the technical lead