Required Skills

Google Cloud Platform

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • 1099-Contract

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :-

  • PG :-

Other Information

  • No. of Positions :- ( )

  • Posted :- 28th Dec 2020

JOB DETAIL

Cloud (GCP) Software Developer

Location and Travel: Remote, Woodland Hills, CA

Roles/Responsibilities:

  • Designing, building and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd-party tools - Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
  • Analyzing, re-architecting and re-platforming on-premises data warehouses to data platforms on GCP using GCP/3rd-party services
  • Designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
  • Architecting and implementing next-generation data and analytics platforms on GCP
  • Designing and implementing data engineering, ingestion and curation functions on GCP using GCP-native services or custom programming
  • Working with recommendation engines, data pipelines, or distributed machine learning and experience with data analytics and data visualization techniques and software
  • Performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
  • Designing and implementing at production scale
  • Willing to work at various client sites throughout the U.S. Must be willing to travel and/or relocate.
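
The ingestion-to-consumption pipeline pattern described in the responsibilities above can be illustrated with a minimal plain-Python sketch. In a real role like this one, these stages would typically run on Apache Beam / Cloud Dataflow and write to a warehouse such as BigQuery; the stage names, fields, and sample data below are purely hypothetical illustrations, not part of the posting.

```python
# Hypothetical sketch of an ingestion -> curation -> consumption pipeline.
# Production pipelines would use Apache Beam / Cloud Dataflow and GCP
# services (Pub/Sub, BigQuery, etc.) rather than in-memory Python.

def ingest(raw_lines):
    """Ingestion stage: parse raw CSV-like lines into records."""
    for line in raw_lines:
        user_id, amount = line.split(",")
        yield {"user_id": user_id.strip(), "amount": float(amount)}

def curate(records):
    """Curation stage: drop invalid rows (non-positive amounts)."""
    return (r for r in records if r["amount"] > 0)

def load(records):
    """Consumption stage: aggregate totals per user for serving."""
    totals = {}
    for r in records:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

raw = ["u1, 10.0", "u2, -5.0", "u1, 2.5"]
print(load(curate(ingest(raw))))  # {'u1': 12.5}
```

Each stage consumes the previous stage's output lazily, mirroring how a streaming pipeline chains transforms from source to sink.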

Required Qualifications

  • Minimum 7 years of designing, building and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd-party tools - Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
  • Minimum 2 years of hands-on experience analyzing, re-architecting and re-platforming on-premises data warehouses to data platforms on GCP using GCP/3rd-party services
  • Minimum 2 years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
  • Minimum 2 years of architecting and implementing next-generation data and analytics platforms on GCP; should have worked in an agile environment.
  • Proven ability to work effectively in a fast-paced, interdisciplinary, and deadline driven environment.
  • Proficiency in financial modeling and quantitative analysis.
  • Strong problem-solving and troubleshooting skills.

Company Information