Required Skills

GBQ, GCP, Python, and ETL

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th Nov 2024

JOB DETAIL

  • Professional experience in Data Engineering, building data pipelines with Google BigQuery (GBQ) and Google Cloud Platform (GCP) datasets.

  • Hands-on, deep experience working with Google data products (e.g. BigQuery, Dataflow, Dataproc, Dataprep, Cloud Composer, Airflow, DAGs, etc.).

  • Expert-level Python programming, including PySpark and Pandas.

  • Experience with Airflow: creating DAGs, configuring Airflow variables, and scheduling (a minimal DAG sketch follows this list).

  • Big Data technologies and solutions (Spark, Hadoop, Hive, MapReduce) and multiple scripting languages (YAML, Python).

  • Experience with DBT to create data lineage in GCP (optional).

  • Worked in a DevSecOps (CI/CD) environment.

  • Design and develop the ETL/ELT framework using BigQuery; expertise in BigQuery concepts such as nested queries, clustering, and partitioning (see the table-creation sketch after this list).

  • Experience with data integration, data transformation, data quality, and data lineage tools.

  • Able to automate data loads from BigQuery using APIs or a scripting language (see the extract sketch after this list).

  • End-to-end (E2E) data engineering and lifecycle management, including non-functional requirements and operations.

  • E2E solution design skills: prototyping, usability testing, and data visualization literacy.

  • Experience with SQL and NoSQL modern data stores.
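
For illustration only, the following is a minimal sketch of the kind of Airflow DAG work described above, assuming Airflow 2.x; the DAG id, variable name, and task body are hypothetical and not taken from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.models import Variable
    from airflow.operators.python import PythonOperator

    # Hypothetical Airflow Variable, configured under Admin -> Variables.
    TARGET_DATASET = Variable.get("target_dataset", default_var="analytics")

    def extract_and_load(**context):
        # Placeholder for an extract/load step against BigQuery.
        print(f"Loading into dataset: {TARGET_DATASET}")

    with DAG(
        dag_id="example_gbq_pipeline",   # assumed name for illustration
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",      # scheduling, as called out above
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )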
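
Similarly, a minimal sketch of creating a partitioned and clustered table with the google-cloud-bigquery Python client; the project, dataset, table, and column names are assumptions made up for the example.

    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    table = bigquery.Table(
        "my-project.analytics.orders",  # hypothetical fully qualified table id
        schema=[
            bigquery.SchemaField("order_id", "STRING"),
            bigquery.SchemaField("order_date", "DATE"),
            bigquery.SchemaField("customer_id", "STRING"),
            bigquery.SchemaField("amount", "NUMERIC"),
        ],
    )
    # Partition on the DATE column and cluster on customer_id to prune scans.
    table.time_partitioning = bigquery.TimePartitioning(field="order_date")
    table.clustering_fields = ["customer_id"]

    client.create_table(table, exists_ok=True)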
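
Finally, a minimal sketch of automating data movement out of BigQuery via the client API; the query, table id, and Cloud Storage bucket are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Scripted extract: run a query and iterate over the result rows.
    rows = client.query(
        "SELECT customer_id, SUM(amount) AS total "
        "FROM `my-project.analytics.orders` GROUP BY customer_id"
    ).result()
    for row in rows:
        print(row["customer_id"], row["total"])

    # Or export a whole table to Cloud Storage as part of an automated job.
    extract_job = client.extract_table(
        "my-project.analytics.orders",
        "gs://example-bucket/exports/orders-*.csv",  # hypothetical bucket
    )
    extract_job.result()  # block until the export completes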

Company Information