Required Skills

DevOps Engineer

Work Authorization

  • US Citizen

  • Green Card

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 16th Mar 2023

JOB DETAIL


The Data Factory Platform covers all business processes and technical components involved in ingesting a wide range of enterprise data into the GDIA Data Factory (data lake) and transforming that data into consumable data sets in support of analytics. The Ingestion Patterns team will own end-to-end responsibility for the platform operating model, which includes defining the business integration model, platform governance processes, problem management processes, and resiliency actions. Additionally, the team will be responsible for defining standards and best practices across all tenants and consumers of platform resources.

Data Factory Engineers will be responsible for designing ingestion patterns and modernizing the company's big data solutions on GCP, integrating native GCP services with third-party data technologies. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate the ability to design the right solutions with an appropriate combination of GCP and third-party technologies deployed on GCP.


Skills Required:

  • Python

  • Airflow

  • Apache Beam

  • CI/CD using Terraform/Tekton (a sketch of how these pieces fit together follows this list)
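
A minimal sketch of this stack in action, assuming Airflow 2.x with the apache-beam and google provider packages installed; the DAG ID, bucket paths, project, and pipeline file are hypothetical placeholders, not details from this posting:

  from datetime import datetime

  from airflow import DAG
  from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
  from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

  with DAG(
      dag_id="data_factory_ingest_example",  # hypothetical DAG ID
      start_date=datetime(2023, 3, 16),
      schedule_interval="@daily",
      catchup=False,
  ) as dag:
      # Launch a Python Apache Beam pipeline on the Dataflow runner.
      # In a setup like the one described above, Terraform would typically
      # provision the GCP resources and Tekton would deploy the DAG and
      # pipeline code.
      run_ingest = BeamRunPythonPipelineOperator(
          task_id="run_beam_ingest",
          py_file="gs://example-bucket/pipelines/ingest.py",  # hypothetical path
          runner="DataflowRunner",
          pipeline_options={"temp_location": "gs://example-bucket/tmp/"},
          dataflow_config=DataflowConfiguration(
              job_name="example-ingest",
              project_id="example-project",  # hypothetical project
              location="us-central1",
          ),
      )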


Experience Required:

  1) 10+ years of total IT experience, including 2-3 years of hands-on experience implementing GCP services in an enterprise setup

  2) 2-3 years of Python development experience

  3) Expertise in Airflow and Apache Beam

  4) 3-4 years of experience implementing IaC and CI/CD pipelines

Company Information