Required Skills

GCP, Google BigQuery, SQL, ETL pipelines, BI, cloud skills, Microsoft SQL; experience in both building and designing.

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 7th Oct 2025

Job Detail

  • You will directly work on the platform based on Google BigQuery and other GCP services to integrate new data sources and model the data up to the serving layer.
  • Contribute to this unique opportunity: the program is set up to completely rethink reporting and analytics with cloud technology.
  • Collaborate with different business groups and users to understand their business requirements, and design and deliver the GCP architecture and data engineering scope of work.
  • You will work on a large-scale data transformation program with the goal to establish a scalable, efficient and future-proof data & analytics platform.
  • Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.
  • Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.
  • Provide technical guidance and mentorship to the team to develop their skills and expertise in GCP.
  • Contribute to the bank's multiyear data analytics modernization roadmap.
  • Stay up to date with the latest GCP technologies, trends, and best practices, and assess their applicability to client solutions.

Qualifications:

What will help you succeed:

  • Bachelor's degree in Computer Science/IT
  • Master's in Data Analytics/Information Technology/Management Information Systems (preferred)
  • 3-5 years of professional experience building data engineering capabilities for various analytics portfolios, with at least 2 years on GCP/cloud-based platforms.
  • Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts.
  • Must have solid implementation experience with GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL.
  • Must have programming experience with SQL, Python, and Apache Spark.

Your expertise in one or more of the following areas is highly valued:

  • Google Cloud Platform, ideally with Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
  • Experience with legacy data warehouses (on SQL Server or any relational data warehouse platform)
  • Experience with dbt (Data Build Tool) or any ETL tool, Terraform/Terragrunt, and Git (CI/CD)
  • Experience with a testing framework.
  • Experience with business intelligence tools such as Power BI and/or Looker.

What sets you apart:

  • Experience with complex migrations from legacy data warehousing solutions, on-prem databases, or data lakes to GCP. The client has an on-prem, enterprise-level SQL Server database.
  • Experience building generic, reusable capabilities, and an understanding of data governance and quality frameworks.
  • Experience in building real-time ingestion and processing frameworks on GCP.
  • Adaptability to learn new technologies and products as the job demands.
  • Multi-cloud & hybrid cloud experience
  • Any cloud certification (preference for GCP certifications)
  • Experience working in the financial and banking industry

Company Information