Work Authorization: US Citizen, Green Card, EAD (OPT/CPT/GC/H4), H1B Work Permit
Employment Type: Corp-to-Corp, W2-Permanent, W2-Contract, Contract to Hire, Consulting/Contract
UG: Not Required
PG: Not Required
No. of Positions: 1
Posted: 21st Oct 2025
We are seeking an experienced, resourceful full-stack engineer who can adapt quickly and hit the ground running with minimal supervision. This individual is passionate about end-user experience and best-in-class engineering, and will join a tight-knit, distributed engineering team developing and delivering a comprehensive data operations management solution for Equifax's Data Fabric Platform.
Data Fabric is a cloud-native, GCP-based data management platform that allows Equifax to acquire and curate data, perform entity resolution, and ingest data into a single environment. It is deployed globally across multiple regions, is highly secure, and complies with regional and internal regulatory controls under strict governance and oversight. Business units, data scientists, and many other stakeholders use APIs to consume data managed by Data Fabric and operate data exchanges that monetize data through B2B and B2C channels.
The data operations management solution consists of:
· A web portal (UI/UX) that provides a single point of access to all data management and data reliability engineering capabilities
· A suite of backend API services that serves the UI and integrates with low-level Data Fabric APIs and other third-party system APIs
· A modern data lakehouse (data lake, data warehouse, and batch and streaming ELT pipelines)
The data operations roadmap envisions a set of rich management capabilities, including:
· Serving a large community of geographically dispersed data operations stakeholders
· Data quality and observability management to detect, alert on, and prevent data anomalies
· Troubleshooting, triaging, and resolving data and data pipeline issues
· OLAP, batch and streaming big data processing, and BI reporting
· MLOps
· Real-time dashboards, alerting and notifications, case management, user/group management, AuthZ, and many other foundational capabilities
Tech Stack
· Frontend: Angular 17+, JavaScript, TypeScript, HTML, SCSS, Webpack Module Federation, Tailwind CSS, Angular Material, Angular Elements
· Backend: Java (JDK 17+), Spring Framework 6.x, Spring Boot 3.x, NestJS 10.x, REST and GraphQL microservices, Node.js
· Tools & Frameworks: Nx build management, monorepo architecture, Jenkins CI/CD, Fortify, Sonar, GitHub
· Cloud & Data: GCP (GKE, Composer + Airflow, Dataflow + Apache Beam, BigQuery, Bigtable, Firestore, GCS, Pub/Sub, Vertex AI), Terraform, Helm charts, GitOps
· Other Technologies: WebSockets, Server-Sent Events (SSE), event-driven architecture
Environment
· Culture: Fast-paced, creative, results-oriented
· Team Structure: Agile, working in 2-week sprints using Aha! and Jira for project management
· Expectations: Self-starters who can work independently with limited guidance, delivering solutions that end-users value and love
Responsibilities
· End-to-End Development: Design, develop, test, deploy, and operate software applications, covering both frontend and backend
· Cross-Functional Work: Collaborate with global teams to integrate with existing internal systems and the GCP cloud
· Issue Resolution: Triage and resolve product or system issues, ensuring quality and performance
· Documentation: Write technical documentation, support guides, and runbooks
· Agile Practices: Participate in sprint planning, retrospectives, and other agile activities
· Compliance: Ensure software meets secure development guidelines and engineering standards
Must-Have Skills
· Cloud-Native Application Development: Experience with GCP or AWS
· Java Expertise: 5+ years with Java (JDK 17+), Spring Framework, Spring Boot
· Frontend Development: 5+ years with Angular, JavaScript, TypeScript, modern responsive web applications
· Architecture Knowledge: Understanding of modular systems, performance, scalability, security
· Agile Experience: Agile development mindset and experience
· Service-Oriented Architecture: Knowledge of RESTful web services, JSON, and Avro
· Application Troubleshooting: Debugging, performance tuning, production support
· Test-Driven Development: Unit, integration, and load testing experience
· Documentation Skills: Strong written and verbal communication
Nice-to-Have Skills
· Linux/Unix: Bash shell scripting
· Scripting Languages: Groovy, Python
· Infrastructure as Code: Experience with Terraform
· Containerization & Orchestration: Docker, Kubernetes
· Cloud Certification: Relevant certifications in cloud technologies
What could set you apart?
· Big Data Processing: ETL/ELT experience with Hadoop, HDFS, Spark, PySpark, Flink, or similar technologies
· GraphQL: Experience with GraphQL
· Reactive Development: Experience with reactive application development
· Distributed Application Development: Experience with multi-tenant applications