Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 30th Jan 2025

Job Details

  • Design and implement comprehensive data architecture strategies that meet the current and future business needs;
  • Develop and document data models, data flow diagrams, and data architecture guidelines;
  • Ensure data architecture is compliant with data governance and data security policies;
  • Collaborate with business stakeholders to understand their data requirements and translate them into technical solutions;
  • Evaluate and recommend new data technologies and tools to enhance data architecture;
  • Build, maintain, and optimize ETL/ELT pipelines for data ingestion, processing, and storage across batch and real-time data processing;
  • Build, maintain, and optimize Data Quality rules leveraging DQ tools and/or other ETL/ELT tools;
  • Develop and deploy scalable data storage solutions using AWS, Azure, and GCP services such as S3, Redshift, RDS, DynamoDB, Azure Data Lake Storage, Azure Cosmos DB, Azure SQL Database, and GCP Cloud Storage;
  • Implement data integration solutions using AWS Glue, AWS Lambda, Azure Data Factory, Azure Functions, GCP Cloud Functions, GCP Dataproc, GCP Dataflow, and other relevant services;
  • Design and manage data warehouses and data lakes, ensuring data is organized and accessible;
  • Monitor and troubleshoot data pipelines, data warehouses, and workflows to ensure data quality, system reliability, performance, and cost management;
  • Implement IAM roles and policies to manage access and permissions within AWS, Azure, and GCP;
  • Use AWS CloudFormation, Azure Resource Manager templates, and Terraform for infrastructure-as-code (IaC) deployments;
  • Use AWS, Azure, and GCP DevOps services to build and deploy DevOps pipelines;
  • Implement data security best practices using AWS, Azure, GCP, Snowflake, or Databricks;
  • Optimize cloud resources for cost, performance, and scalability;
  • Strong proficiency in SQL and experience with relational databases;
  • Proficient in programming languages such as Python, Java, or Scala;
  • Familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus;
  • Experience with machine learning and data science workflows is a plus;
  • Knowledge of data governance and data security best practices;
  • Strong analytical, problem-solving, and communication skills; and
  • Ability to work independently and as part of a team in a fast-paced environment.


Company Information