Bachelor's or master's degree in computer science, software engineering, or a related field.
Minimum of 8-10 years of experience in software engineering.
Expertise in the Java programming language, including frameworks such as Spring Boot.
Expertise in API development (REST, GraphQL), including the use of API gateways.
Expertise in implementing scalable containerized solutions, for example with Kubernetes or AWS ECS.
Good understanding of cloud databases such as AWS Aurora PostgreSQL, DynamoDB, and MongoDB for efficient data processing, ingestion, and distribution.
Good understanding of optimizing database performance through sharding, partitioning, indexing, query optimization, and other tuning techniques.
Expertise in using cloud-based data storage solutions such as Amazon S3 or Google Cloud Storage.
Good hands-on experience writing SQL against large structured and unstructured datasets.
Experience with workflow orchestration tools such as Airflow or Orkes.
Proficiency in data streaming services such as Kafka, Kinesis, and IBM MQ.
Expertise in developing applications that produce and consume payloads from data streaming solutions such as Kafka and Kinesis Data Streams.
Proficiency in data formats such as JSON, HL7, EDI, and XML.
Good understanding of MDM systems and concepts, and tools such as Reltio and IBM InfoSphere.
Familiarity with the system development lifecycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira.
Experience with Infrastructure as Code (IaC) technologies such as CloudFormation or Terraform.
Familiarity with AI/MLOps concepts and generative AI technologies (good to have).
Deep technical knowledge of at least one public cloud provider's services (preferably AWS).
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills with both technical and non-technical stakeholders.