Work Authorization
US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Preferred Employment
Corp-Corp
W2-Permanent
W2-Contract
Contract to Hire
Employment Type
Consulting/Contract
Education Qualification
UG: Not Required
PG: Not Required
Other Information
No. of positions: 1
Posted: 23rd Nov 2023
JOB DETAIL
10+ years of overall IT experience
3+ years of experience with high-velocity high-volume stream processing: Apache Kafka and Spark Streaming
Experience with real-time data processing and streaming techniques using Spark structured streaming and Kafka
Deep knowledge of troubleshooting and tuning Spark applications
3+ years of experience with data ingestion from Message Queues (Tibco, IBM, etc.) and different file formats across different platforms like JSON, XML, CSV
3+ years of experience with Big Data tools/technologies such as Hadoop, Spark, Spark SQL, Kafka, Sqoop, Hive, S3, and HDFS
3+ years of experience building, testing, and optimizing ‘Big Data’ data ingestion pipelines, architectures, and data sets
2+ years of experience with Python (and/or Scala) and PySpark/Scala-Spark
3+ years of experience with cloud platforms, e.g., AWS and GCP
3+ years of experience with database solutions such as Kudu/Impala, Delta Lake, Snowflake, or BigQuery
2+ years of experience with NoSQL databases, including HBASE and/or Cassandra
Experience in successfully building and deploying a new data platform on Azure/ AWS
Experience with Azure/AWS serverless technologies such as S3, Kinesis/MSK, Lambda, and Glue
Strong knowledge of messaging platforms such as Kafka, Amazon MSK, TIBCO EMS, or IBM MQ Series
Experience with Databricks UI, Managing Databricks Notebooks, Delta Lake with Python, Delta Lake with Spark SQL, Delta Live Tables, Unity Catalog
Knowledge of the Unix/Linux platform and shell scripting is a must
Strong analytical and problem-solving skills
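As a rough illustration of the multi-format ingestion work listed above (JSON, XML, CSV from different platforms), here is a minimal plain-Python sketch that normalizes payloads in each format into a common list-of-dicts shape. The record fields (`id`, `v`) and the flat XML layout are hypothetical; a production pipeline for this role would use Spark rather than the standard library.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_json(text):
    """Parse a JSON payload (object or array) into a list of record dicts."""
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

def parse_csv(text):
    """Parse CSV text with a header row into a list of record dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def parse_xml(text):
    """Parse flat XML like <records><record><id>1</id></record></records>."""
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in rec} for rec in root]

def ingest(payload, fmt):
    """Dispatch a raw payload to the parser for its declared format."""
    parsers = {"json": parse_json, "csv": parse_csv, "xml": parse_xml}
    return parsers[fmt](payload)
```

For example, `ingest('[{"id": "1"}]', "json")`, `ingest("id\n1\n", "csv")`, and `ingest("<records><record><id>1</id></record></records>", "xml")` all yield `[{"id": "1"}]`, so downstream code sees one record shape regardless of source format.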
Preferred:
Strong SQL skills with the ability to write intermediate-complexity queries
Strong understanding of Relational & Dimensional modeling
Experience with GIT code versioning software
Experience with REST API and Web Services
Strong business analysis and requirements gathering/writing skills
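To illustrate the "intermediate complexity" SQL called for above, the sketch below runs a join-plus-aggregate query with a `HAVING` filter against a throwaway in-memory SQLite database. The `orders`/`customers` schema and the threshold are purely hypothetical examples, not anything from this posting.

```python
import sqlite3

# Hypothetical demo schema in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (10, 1, 50.0), (11, 1, 70.0), (12, 2, 20.0);
""")

# Total order amount per region, keeping only regions above a threshold:
# a join, an aggregate, a HAVING filter, and an ordering in one query.
query = """
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    HAVING SUM(o.amount) > 30
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
# EU totals 120.0 and passes the filter; US totals 20.0 and is dropped.
```

The same query shape (join, group, filter on the aggregate) carries over directly to Spark SQL or Snowflake.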