Required Skills

DevOps Engineer with OpenShift & Python3

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Aug 2021

JOB DETAIL

Python 3, AWS Lambda, Bash, Git, GitLab, Python API development, and developing APIs on Kubernetes platforms such as OpenShift Container Platform.

Mandatory:
•       At least 4-5 years of enterprise experience in DevOps, platform, and cloud engineering environments
•       Experience with cloud application deployment automation patterns and various scripting languages
•       At least 4-5 years of enterprise experience with Jenkins automated build and continuous integration (CI/CD) systems
•       Experience with DevOps tools such as Concourse, Jenkins, Artifactory, AWS, Azure, etc.
•       At least 4-5 years of enterprise experience with Python, primarily Python 3
•       At least 6-7 years of experience in Bash and/or shell scripting
•       At least 2-3 years of experience in API development, preferably in Python
•       At least 2-3 years of experience developing APIs on Kubernetes platforms such as Red Hat OpenShift Container Platform
•       At least 2-3 years of experience with serverless development on AWS Lambda
•       At least 2-3 years of enterprise experience with monitoring tools (Splunk or cloud-native tools)
•       At least 3-4 years of enterprise experience building data pipelines, CI/CD pipelines, and fit-for-purpose data stores
•       At least 4-5 years of experience with Git/GitLab, Jenkins, etc., and building automation/pipelines around them
•       At least 2-3 years of enterprise configuration scripting experience with Ansible, Chef, or Terraform
•       Actively participating in the planning, definition, design, and integration of pipeline services, working with partners across the enterprise to ensure consistency of development
•       Owning the implementation of generic common component/service aspects of pipelines, which may include application, infrastructure, security, and/or database strategies
•       Participating in the enforcement of coding standards for the team, as well as implementing a strong internal, repeatable DataOps process
 
Nice to Have:
1.    Hands-on Snowflake experience, or at least training and certification
2.    Hands-on experience with ETL tools such as Informatica, Ab Initio, DataStage, Talend, etc.
3.    Cloud experience (ideally AWS: Lambda, EMR, EC2, S, CFT, etc.)

Company Information