Required Skills

BigQuery, GCP

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 30th Sep 2022

Job Detail

  • 10+ years of experience working as a Lead Data Engineer
  • Senior-level experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam/Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub
  • Experience working in GCP and Google BigQuery; strong SQL knowledge with the ability to translate complex scenarios into queries
  • Mentor other data engineers, have a voice in defining the technical culture, and help build a fast-growing team
  • Possess excellent written and verbal communication skills with the ability to communicate with team members at various levels, including business leaders
  • Coordinate with developers, architects, stakeholders, and cross-functional teams on both the organization and customer sides
  • Strong programming experience in Python or Java; experience with data modeling and mapping
  • Experience with Google Cloud Platform (especially BigQuery); experience developing scripts to flow data into BigQuery from external data sources (see the sketch after this list)
  • Experience with Data Fusion for automating data movement and QA; experience with Google Cloud SDK and API scripting
  • Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to the GCP cloud
  • An active Google Cloud Data Engineer certification or Google Professional Cloud Architect certification is a plus
  • Data migration experience from on-premises legacy systems (Hadoop, Exadata, Oracle, Teradata, or Netezza) to any cloud platform
  • Experience with data lake and data warehouse ETL design and build
  • Experience designing and building production data pipelines from data ingestion to consumption within a hybrid big data architecture, using cloud-native GCP services, Java, Python, Scala, SQL, etc.
  • Experience implementing next-generation data and analytics platforms on GCP
  • Experience with Jenkins, Jira, and Confluence
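
As a rough illustration of the BigQuery ingestion scripting described in the list above (a minimal sketch only; the project, dataset, table, and Cloud Storage paths are hypothetical placeholders, not details from this posting), a load script using the google-cloud-bigquery Python client could look like:

    # Minimal sketch: load a CSV file from Cloud Storage into a BigQuery table.
    # The project, dataset, table, and bucket names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # assumes application-default credentials

    table_id = "example-project.analytics.sales_raw"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the CSV header row
        autodetect=True,       # let BigQuery infer the schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/exports/sales.csv",
        table_id,
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}.")

In practice, a script like this would typically be parameterized and scheduled (for example, via Cloud Composer), but the load-job pattern above is the core of moving external data into BigQuery.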

 

Company Information