Required Skills

GCP

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 30th Jun 2025

JOB DETAIL

  • Collaborate with cross-functional teams to design and implement scalable and reliable systems on Google Cloud Platform, balancing performance, security, and cost-effectiveness.
  • Build data ingestion pipelines to extract data from various sources (Azure Blob, Azure SQL, flat files, semi-structured sources, AWS S3) into the data warehouse in GCP.
  • Utilize GCP services to build robust and scalable data solutions.
  • Design, develop, and maintain data pipelines and implement data architecture on GCP using services such as BigQuery, Cloud Storage, and Cloud Composer.
  • Apply expertise in tools and technologies for collecting, cleaning, transforming, and modelling data to produce useful information.
  • Leverage GCP capabilities and technologies to migrate existing databases to the cloud.
  • Collaborate with cross-functional teams to understand data requirements and implement scalable solutions.
  • Implement and optimize BigQuery tables and complex SQL queries for efficient data retrieval and performance.
  • Experience in data migration from on-premises databases to BigQuery, including BQ conversion.
  • Experience building and scheduling data pipelines using Cloud Composer (Airflow) and performing data and file transformations in Python.
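As a minimal sketch of the Python file-transformation step such a pipeline might run before a load job (all names and the sample data are illustrative; a real pipeline would orchestrate this in a Cloud Composer DAG and hand the result to a BigQuery load job):

```python
import csv
import io
import json

def csv_to_ndjson(csv_text: str) -> str:
    """Clean raw CSV rows and emit newline-delimited JSON,
    the source format BigQuery load jobs accept for JSON data."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Normalize column names to lowercase, strip stray whitespace,
        # and drop empty values so they load as NULLs rather than "".
        clean = {k.strip().lower(): v.strip()
                 for k, v in row.items() if v and v.strip()}
        if clean:
            rows.append(clean)
    return "\n".join(json.dumps(r, sort_keys=True) for r in rows)

# Hypothetical messy extract from a flat-file source
raw = "ID, Name ,City\n1, Alice ,Boston\n2, Bob ,\n"
print(csv_to_ndjson(raw))
```

In Composer, a step like this would typically sit in a PythonOperator task between the extract task and the BigQuery load task.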

EDW (Enterprise Data Warehouse) and Data Model Designing:

  • Experience with data modelling, data warehousing, and ETL processes.
  • Work closely with business stakeholders and analysts to design and implement data models for effective data representation and analysis.
  • Ensure data models meet industry standards and compliance requirements in the health-care domain.
  • Contribute to the design and development of the enterprise data warehouse architecture.
  • Implement best practices for data storage, retrieval, and security within the EDW.
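To illustrate the dimensional-modelling side of this work, here is a toy surrogate-key lookup of the kind used when loading a star-schema dimension (the patient/visit names are hypothetical, not from this posting):

```python
class Dimension:
    """Toy star-schema dimension: maps a natural key (e.g. a source
    system's patient ID) to a stable integer surrogate key that fact
    tables join on."""

    def __init__(self):
        self._keys = {}   # natural key -> surrogate key
        self._next_sk = 1

    def surrogate_key(self, natural_key: str) -> int:
        # Assign a new surrogate key on first sight and reuse it
        # afterwards, so repeated fact rows for the same entity
        # always resolve to the same dimension row.
        if natural_key not in self._keys:
            self._keys[natural_key] = self._next_sk
            self._next_sk += 1
        return self._keys[natural_key]

dim_patient = Dimension()
fact_rows = [{"patient_sk": dim_patient.surrogate_key(pid), "visits": v}
             for pid, v in [("P001", 3), ("P002", 1), ("P001", 2)]]
```

Decoupling fact tables from volatile natural keys this way is a standard EDW practice; in BigQuery the same lookup is usually expressed as a MERGE against the dimension table.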

Health-care Domain Knowledge:

  • Apply domain-specific knowledge to ensure that data solutions comply with health-care industry regulations and standards.
  • Stay updated on industry trends and advancements in health-care data management.

Collaboration:

  • Work collaboratively with cross-functional teams, including Business Teams, analysts, and software engineers, to deliver integrated and effective data solutions.
  • Participate in code reviews and provide constructive feedback to team members.

Qualifications (Required Technical skills/Experience):

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proficiency in GCP and in-depth knowledge of GCP services, including BigQuery, Cloud Functions, and Cloud Composer.
  • Strong Python programming skills for data engineering.
  • Experience with data modeling, SQL, and EDW design.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Proficiency in version control systems, particularly Git.
  • Strong understanding of data warehouse concepts and data lakes.

Company Information