Required Skills

Glue, Lambda, Step Functions, Redshift

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 4th Mar 2024

JOB DETAIL

  • Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
  • Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
  • Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
  • Develop applications using Big Data technologies such as Apache Hadoop and Apache Spark, together with appropriate cloud services such as AWS.
  • Build data pipelines using ETL (Extract-Transform-Load) processes.
  • Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
  • Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
  • Analyse requirements/user stories in business meetings, assess their impact across platforms/applications, and convert business requirements into technical requirements.
  • Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
  • Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security.
  • Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
  • Coordinate with release management and other supporting teams to deploy changes to the production environment.
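The ETL pipeline responsibilities above can be sketched in miniature. This is a hedged illustration only, not the employer's actual stack: the function names (`extract_rows`, `transform_rows`, `load_rows`) are invented for the example, and an in-memory CSV string and SQLite stand in for the real sources and for a warehouse such as Redshift.

```python
import csv
import io
import sqlite3


def extract_rows(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text (a stand-in for reading from S3) into dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform_rows(rows: list[dict]) -> list[tuple]:
    """Transform: normalize whitespace and types, dropping malformed records."""
    out = []
    for r in rows:
        try:
            out.append((r["id"], r["name"].strip(), float(r["amount"])))
        except (KeyError, ValueError):
            continue  # skip records with missing fields or non-numeric amounts
    return out


def load_rows(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    """Load: insert into a warehouse table (SQLite standing in for Redshift)."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id TEXT, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]


if __name__ == "__main__":
    raw = "id,name,amount\n1, Alice ,10.5\n2,Bob,bad\n3,Carol,7\n"
    conn = sqlite3.connect(":memory:")
    loaded = load_rows(conn, transform_rows(extract_rows(raw)))
    print(loaded)  # rows loaded after the malformed record is dropped
```

In a production AWS setting the same extract/transform/load split would typically map to Glue jobs or Lambda functions orchestrated by Step Functions, with Redshift as the load target.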

Minimum Qualifications

  • Bachelor's degree in computer science, engineering, or a related field (or equivalent work experience).
  • Strong experience in designing, implementing, and managing AWS data services.
  • Working experience implementing data lakes using services such as Glue, Lambda, Step Functions, and Redshift.
  • Experience with Databricks is an added advantage.
  • Strong experience with Python and SQL.
  • Strong understanding of security principles and best practices for cloud-based environments.
  • Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
  • Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
  • Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills

  • Master's degree in Computer Science, Electronics, or Electrical Engineering.
  • AWS Data Engineering and Cloud certifications; Databricks certifications.
  • Experience with multiple data integration technologies and cloud platforms.
  • Knowledge of Change & Incident Management processes.

Company Information