Required Skills

Spark, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Sep 2022

JOB DETAIL

Looking for an ETL Data Engineer

ETL Data Engineer
Location: Alpharetta, GA
Hybrid model; onsite from day 1.

Data Engineering Focused Role:
•       Senior experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub
•       Experience working in GCP and Google BigQuery; strong SQL knowledge, able to translate complex scenarios into queries.
•       Strong programming experience in Python or Java; experience with data modeling and mapping.
•       Experience with Google Cloud Platform (especially BigQuery), including developing scripts for loading data into BigQuery from external data sources.
•       Experience in Data Fusion for automation of data movement and QA; experience with Google Cloud SDK and API scripting.
•       Experience in performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
•       An active Google Cloud Data Engineer or Google Professional Cloud Architect certification is a plus
•       Data migration experience from on-prem legacy systems (Hadoop, Exadata, Oracle, Teradata, or Netezza) to any cloud platform
•       Experience with data lake and data warehouse ETL build and design
•       Experience in designing and building production data pipelines, from data ingestion to consumption, within a hybrid big data architecture using cloud-native GCP services, Java, Python, Scala, SQL, etc.
•       Experience in implementing next-generation data and analytics platforms on GCP
•       Experience with Jenkins, Jira, and Confluence
•       Data engineering, data profiling, and data warehousing
•       Hands-on expertise in SQL or Spark, and in Python or Java, is a must.

Company Information