Required Skills

Spark, Scala, GCP, Java, Spark SQL, PySpark, HDFS (Hadoop), MapReduce

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 10th Jul 2024

JOB DETAIL

  • Skilled in Big Data technologies such as Spark, Scala, GCP, Java, Spark SQL, PySpark, HDFS (Hadoop), and MapReduce.
  • Advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience with object-oriented/functional programming languages such as Scala and Java.
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Strong analytic skills related to working with unstructured datasets.
  • Working knowledge of highly scalable "big data" data stores.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
  • Experience developing enterprise software products.
  • Experience building monitoring and automated testing to ensure data consistency and availability.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Experience working in an Agile environment.

Company Information