Required Skills

Data Engineer (Google Cloud)

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-to-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Sep 2021

JOB DETAIL

  • Support the design and development of complex ETL processes that integrate data sets from multiple sources
  • Create mappings, ETL workflows, data flows, and stored procedures
  • Support system and user acceptance testing activities, including issue resolution
  • Prepare documentation covering the technical design of end-to-end data interfaces, unit test cases, unit test logs, and deployment documentation
  • Experience with Python and Bash scripting or PowerShell
  • Experience with infrastructure as code (Deployment Manager, Terraform)
  • Hands-on experience with continuous integration/delivery tools such as Git and Jenkins, and with common development workflows in GitHub/GitLab/Bitbucket
  • Experience with reporting schema designs, including data modeling, denormalization, data warehousing, and data lakes
  • Experience with data quality monitoring and alerting on dynamic data sources
  • Experience developing automated deployment scripts and tools for system provisioning and configuration
  • A working knowledge of IT infrastructure (server, network, storage, load balancing, clustering, etc.)
  • Scripting ability in one or more general-purpose programming languages, including but not limited to Java, C/C++, C#, Objective-C, Python, and JavaScript
  • Application migration or data migration experience from on-prem to cloud is required
  • Experience developing Dataflow jobs, Dataproc jobs, Spark jobs, Spark SQL, Python scripts, Scala scripts, and Jupyter notebooks
  • Working experience with Google BigQuery and Bigtable implementations
  • 3+ years of experience developing Spark jobs on Dataproc clusters
  • Working experience with GCP Dataproc (Hadoop/HDFS environment) and its integration with BigQuery (a minimal sketch follows this list)
  • 3+ years of experience with Big Data technologies (Hive, HDFS, Spark, Scala, and Airflow)
  • 3+ years of scripting experience using Python, Pig, Scala, and Spark SQL
  • Experience with performance tuning and with techniques for handling large-volume data transformations
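
As a rough illustration of the Dataproc-to-BigQuery integration listed above, the PySpark sketch below reads a BigQuery table, applies a simple Spark SQL aggregation, and writes the result back to BigQuery through the spark-bigquery connector. This is a minimal sketch, not part of the role description: the project, dataset, table, bucket, and column names are placeholders, and it assumes the connector jar is available on the Dataproc cluster.

```python
# Minimal PySpark sketch of a Dataproc job integrating with BigQuery via the
# spark-bigquery connector. All resource and column names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("bq-dataproc-sketch")
    .getOrCreate()
)

# Temporary GCS bucket used by the connector for indirect writes
# (placeholder name; any bucket the job's service account can write to).
spark.conf.set("temporaryGcsBucket", "my-temp-bucket")

# Read a source table from BigQuery into a Spark DataFrame.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Apply a simple Spark SQL aggregation.
orders.createOrReplaceTempView("orders")
daily_totals = spark.sql(
    "SELECT order_date, SUM(amount) AS total_amount "
    "FROM orders GROUP BY order_date"
)

# Write the aggregated result back to a BigQuery reporting table.
(
    daily_totals.write.format("bigquery")
    .option("table", "my-project.reporting.daily_totals")
    .mode("overwrite")
    .save()
)
```

A job of this shape would typically be submitted to an existing cluster with `gcloud dataproc jobs submit pyspark`, with the connector supplied by the cluster image or a `--jars` flag.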


Company Information