Required Skills

Data Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 4th Jun 2025

JOB DETAIL

  • In-depth knowledge of Apache Spark and its APIs (Spark SQL and DataFrame APIs, Spark Structured Streaming, and Spark MLlib for analytics) and Kafka; able to code in Scala/Java.

  • Knowledge of Flink, including its streaming and batch execution modes, caching, and performance optimization.

  • Design and develop analytics workloads using Apache Spark and Scala for processing big data.

  • Create and optimize data transformation pipelines using Spark or Apache Flink.

  • Proficiency in performance tuning and optimization of Spark jobs.

  • Experience migrating existing analytics workloads from cloud platforms to open-source Apache Spark infrastructure running on Kubernetes.

  • Expertise in data modeling and optimization techniques for large-scale datasets.

  • Extensive experience with real-world Spark production deployments.

  • Strong understanding of Data Lake, Big Data, ETL processes, and data warehousing concepts.

  • Good understanding of lakehouse storage technologies such as Delta Lake and Apache Iceberg.

  • Working knowledge of AWS.
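The Spark, Kafka, and Scala skills listed above can be illustrated with a minimal Structured Streaming sketch. This is an illustrative example only, not part of the role description: the broker address (`broker:9092`) and topic name (`events`) are placeholders, and it assumes the `spark-sql-kafka` connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaWordCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaWordCount")
      .getOrCreate()
    import spark.implicits._

    // Read a stream of records from Kafka.
    // Broker and topic below are placeholder values.
    val lines = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS line")

    // Count words across the stream using the DataFrame API.
    val counts = lines
      .select(explode(split($"line", "\\s+")).as("word"))
      .groupBy("word")
      .count()

    // Write the running counts to the console (illustration only;
    // a production job would write to a durable sink with checkpointing).
    val query = counts.writeStream
      .outputMode("complete")
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

A job like this combines Spark SQL/DataFrame APIs with Structured Streaming and Kafka integration, which covers several of the requirements above in one small program.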

Company Information