Required Skills

Data Architect with DevOps

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H-1B Work Permit

Preferred Employment

  • Corp-to-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 21st Sep 2021

JOB DETAIL

We are looking for either a DevOps Architect with a strong Data background or a Data Architect with strong experience in DevOps.

Role: Data Architect with strong experience in DevOps and Data Pipeline/Process Reengineering

Primary Skillsets: Informatica ETL, Data Architecting, Snowflake or modern Cloud Data Platforms, DevOps, CI/CD, Python 3, Python API Development, Bash, Git, GitLab
• 10+ years of enterprise experience in Informatica ETL, Data Architecting, and Data Engineering; 3+ years of experience in Snowflake or modern Cloud Data Platforms
• 5+ years of enterprise experience in a DevOps, Platform, and Cloud Engineering environment (Mandatory)
• Experience with cloud application deployment automation patterns and with automation in various scripting languages (Mandatory)
• 4-5 years of enterprise experience with Jenkins automated builds and continuous integration (CI/CD) systems
• Experience with DevOps tools such as Concourse, Jenkins, Artifactory, AWS, Azure, etc.
• 4-5 years of enterprise experience with Python, mainly Python 3
• 6-7 years of experience in Bash and/or shell scripting
• 2-3 years of experience in API development, preferably in Python
• 3-4 years of enterprise experience building Data Pipelines, CI/CD pipelines, and fit-for-purpose Data Stores (Mandatory)
• 4-5 years of experience with Git/GitLab, Jenkins, etc., and building automation/pipelines around them (Mandatory)
• 2-3 years of enterprise configuration scripting experience with Ansible, Chef, or Terraform
• Actively participating in the planning, definition, design, and integration of pipeline services, working with enterprise partners to ensure consistency of development (Mandatory)
• Owning the implementation of generic common component/service aspects of pipelines, which may include application, infrastructure, security, and/or database strategies (Mandatory)
• Participating in enforcing coding standards for the team, as well as implementing a strong internal, repeatable DataOps process
1. Hands-on Snowflake experience, or at least trained and certified.
2. Hands-on experience with ETL tools like Informatica, Ab Initio, DataStage, Talend, etc.
3. Cloud experience (preferably AWS: Lambda, EMR, EC2, S3, CFT, etc.)
 

Company Information