Required Skills

MySQL, PostgreSQL, and GCP certification

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Nov 2024

JOB DETAIL

Function as member of an agile team by contributing to software builds through consistent development practices (tools, common components, and documentation)
Participate in code reviews and automated testing
Debug basic software components and identify code defects for remediation
Enable the deployment, support, and monitoring of software across test, integration, and production environments
Automate deployments in test or production environments
Automatically scale applications based on demand projections
Architect, design, develop, and maintain enterprise capabilities and systems
Collaborate with teams in an agile environment to create, evolve, and maintain engineering excellence principles related to code, code reviews, testing methodologies (unit, integration, etc.), and defect management
Develop prototypes, applying visualization and other techniques to fast-track concepts.

Qualifications expected:
experience with relational database (RDBMS) administration, development, and tuning - specifically MySQL and Singlestore
experience with database support of OLTP and OLAP systems in a large dev and prod environment
experience with database development and support of Massive Parallel Processing (MPP) systems, HA and DR
experience in analytical and columnar databases (such as Jethro) and supporting multiple application teams
experience in resolving data ingestion issues, performance issues and day-to-day support
responsible for Linux infrastructure administration, DBMS upgrades, and shell scripting
experience with transforming logical data architectures into physical data designs
experience with Big Data tools, including the Apache Hadoop, Hive, and Spark distributed frameworks
experience with public cloud platform (GCP, AWS, etc.) optimization, enabling managed and serverless services
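
One of the duties above, scaling applications based on demand projections, amounts to a small capacity calculation. The sketch below is illustrative only; the function name, the requests-per-second capacity model, and the replica bounds are assumptions, not part of this posting:

```python
import math

def replicas_needed(projected_rps: float, per_replica_rps: float,
                    min_replicas: int = 1, max_replicas: int = 10) -> int:
    """Estimate replica count for a projected request rate.

    Hypothetical capacity model: each replica handles per_replica_rps
    requests per second; the result is clamped to [min_replicas, max_replicas].
    """
    if per_replica_rps <= 0:
        raise ValueError("per_replica_rps must be positive")
    needed = math.ceil(projected_rps / per_replica_rps)
    return max(min_replicas, min(max_replicas, needed))
```

In practice an autoscaler (e.g. a Kubernetes HorizontalPodAutoscaler) performs this clamping against observed or forecast load rather than a hand-written function.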

Company Information