Required Skills

Hadoop SQL

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 28th Feb 2024

JOB DETAIL

Big Data Engineer. GCP experience is an absolute requirement for this role.

  • 5+ years of software development experience, including leading teams of engineers and scrum teams

  • 3+ years of hands-on experience working with MapReduce, Hive, and Spark (core, SQL, and PySpark)

  • Hands-on experience writing and understanding complex SQL (Hive/PySpark DataFrames), including optimizing joins while processing large volumes of data

  • Experience in UNIX shell scripting

Additional good-to-have requirements:

  • Solid data warehousing concepts

  • Knowledge of the financial reporting ecosystem is a plus

  • Experience with data visualization tools such as Tableau, Sisense, and Looker

  • Expertise in distributed ecosystems

  • Hands-on programming experience with Python/Scala

  • Expertise in Hadoop and Spark architecture and their working principles

  • Ability to design and develop optimized data pipelines for batch and real-time data processing

  • Experience in analysis, design, development, testing, and implementation of system applications

  • Demonstrated ability to develop and document technical and functional specifications and analyze software and system processing flows

  • Aptitude for learning and applying programming concepts

  • Ability to communicate effectively with internal and external business partners

Preferred Qualifications:

  • Knowledge of cloud platforms such as GCP/AWS and experience building microservices and scalable solutions

  • 2+ years of experience designing and building solutions using Kafka streams or queues

  • Experience with GitHub and CI/CD pipelines

  • Experience with NoSQL databases, e.g., HBase, Couchbase, MongoDB

Company Information