Required Skills

Big Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 26th Jan 2023

JOB DETAIL

Lead Data Engineering Focused Role: 10 years of experience, working as a Lead Data Engineer

# Senior experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub

# Experience working in GCP and Google BigQuery; strong SQL knowledge with the ability to translate complex scenarios into queries

# Mentor other data engineers, have a voice in defining the technical culture, and help to build a fast-growing team

# Possess excellent written and verbal communication skills with the ability to communicate with team members at various levels, including business leaders

# Coordinate with developers, architects, stakeholders, and cross-functional teams on both the organization and customer sides

# Strong programming experience in Python or Java; experience with data modeling and mapping

# Experience with Google Cloud Platform (especially BigQuery); experience developing scripts for loading data into GBQ from external data sources (a minimal load-script sketch appears after this list)

# Experience in Data Fusion for automation of data movement and QA; experience with the Google Cloud SDK and API scripting

# Experience in performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP

# An active Google Cloud Data Engineer certification or an active Google Professional Cloud Architect certification is a plus

# Data migration experience from on-premises legacy systems (Hadoop, Exadata, Oracle, Teradata, or Netezza) to any cloud platform

# Experience with data lake and data warehouse ETL build and design

# Experience in designing and building production data pipelines, from data ingestion to consumption, within a hybrid big data architecture using cloud-native GCP services, Java, Python, Scala, SQL, etc. (see the pipeline sketch after this list)

# Experience in implementing next-generation data and analytics platforms on GCP

# Experience with Jenkins, Jira, and Confluence
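
As an illustration of the "scripts for loading data into GBQ" item above, here is a minimal Python sketch using the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders and schema autodetection is assumed for simplicity; this is a sketch of the kind of script the role describes, not the employer's actual code.

# Minimal sketch: load CSV files from Cloud Storage into a BigQuery table.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

table_id = "my-project.analytics.daily_sales"       # hypothetical target table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                 # let BigQuery infer the schema
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/daily_sales_*.csv",      # hypothetical external source files
    table_id,
    job_config=job_config,
)
load_job.result()                                    # block until the load job finishes

table = client.get_table(table_id)
print(f"Loaded {table_id}; it now has {table.num_rows} rows")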
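
Similarly, for the "production data pipelines from ingestion to consumption" item, the sketch below shows a small Apache Beam pipeline (runnable on Cloud Dataflow) that reads CSV files from Cloud Storage and writes rows to BigQuery. The bucket, project, dataset, table, and column names are assumptions for illustration only.

# Minimal Apache Beam sketch: GCS CSV ingestion -> BigQuery.
# Resource and column names are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Turn a CSV line "order_id,amount" into a BigQuery row dict.
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

# Pass --runner=DataflowRunner, --project, --region, --temp_location, etc.
# on the command line to run this on Cloud Dataflow instead of locally.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv",
                                                skip_header_lines=1)
        | "ParseCSV" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )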

Company Information