Required Skills

Java, Scala, S3, Glue, Redshift, AWS

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Dec 2023

JOB DETAIL

6-8 years of IT experience focusing on enterprise data architecture and management. 

• Experience in Conceptual/Logical/Physical Data Modelling, with expertise in Relational and Dimensional Data Modelling

• Experience with Databricks (including on-prem deployments), Structured Streaming, Delta Lake concepts, and Delta Live Tables required

• Experience with Spark programming in Scala and Java

• Data lake concepts such as time travel, schema evolution, and optimization (see the Delta Lake sketch after this list)

• Structured Streaming and Delta Live Tables experience with Databricks is a bonus

• Experience leading and architecting enterprise-wide initiatives, specifically system integration, data migration, transformation, data warehouse builds, data mart builds, and data lake implementation/support

• Advanced understanding of streaming data pipelines and how they differ from batch systems

• Ability to formalize concepts such as handling late data, defining windows, and data freshness (see the streaming sketch after this list)

• Advanced understanding of ETL and ELT, and of ETL/ELT tools such as AWS Database Migration Service (DMS)

• Understanding of concepts and implementation strategies for incremental data loads such as tumbling windows, sliding windows, and high-watermark loads (see the high-watermark sketch after this list)

• Familiarity or expertise with Great Expectations or another data quality/data validation framework is a bonus

• Familiarity with concepts such as late data, defining windows, and how window definitions impact data freshness 

• Advanced-level SQL experience (joins, aggregation, windowing functions, common table expressions, RDBMS schema design and performance optimization); see the SQL sketch after this list

• Experience with indexing and partitioning strategies

• Ability to debug, troubleshoot, and design and implement solutions for complex technical issues

• Experience deploying large-scale, high-performance enterprise big data applications and solutions

• Architecture experience in an AWS environment is a bonus

• Familiarity with AWS Lambda, specifically how to push and pull data and how to use AWS tools to view and process data at massive scale, is a bonus

• Experience with GitLab and CloudWatch, and the ability to write and maintain GitLab configurations supporting CI/CD pipelines

• Experience working with
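
The streaming sketch: a minimal Spark Structured Streaming job in Scala illustrating the windowing, watermark, and late-data concepts above. It assumes Spark's built-in rate source as a stand-in for a real stream such as Kafka; the 10-minute watermark bounds how late an event may arrive before it is dropped, and the 5-minute tumbling window drives the aggregation.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, count, lit, window}

    object StreamingWindowSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("streaming-window-sketch")
          .master("local[*]")
          .getOrCreate()

        // The built-in rate source emits (timestamp, value) rows and is a
        // convenient stand-in here for a Kafka or Kinesis stream.
        val events = spark.readStream
          .format("rate")
          .option("rowsPerSecond", 10)
          .load()

        // Watermark: events arriving more than 10 minutes behind the latest
        // observed event time count as late data and are dropped.
        // window(): a 5-minute tumbling window; adding a slide argument,
        // window(col("timestamp"), "10 minutes", "5 minutes"), makes it sliding.
        val counts = events
          .withWatermark("timestamp", "10 minutes")
          .groupBy(window(col("timestamp"), "5 minutes"))
          .agg(count(lit(1)).as("events"))

        counts.writeStream
          .outputMode("update")
          .format("console")
          .start()
          .awaitTermination()
      }
    }

The watermark is what ties window definitions to data freshness: a longer watermark tolerates later data but delays the point at which a window's result can be considered final.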
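
The Delta Lake sketch: time travel and schema evolution in Scala, assuming the delta-spark library is on the classpath and a hypothetical table at /tmp/events_delta.

    import org.apache.spark.sql.SparkSession

    object DeltaLakeSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("delta-lake-sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        val path = "/tmp/events_delta" // hypothetical table location

        // Write an initial version of the table (version 0).
        Seq((1L, "click")).toDF("id", "event")
          .write.format("delta").mode("overwrite").save(path)

        // Schema evolution: mergeSchema lets this append add the new
        // "channel" column instead of failing on a schema mismatch.
        Seq((2L, "click", "web")).toDF("id", "event", "channel")
          .write.format("delta").mode("append")
          .option("mergeSchema", "true").save(path)

        // Time travel: read the table as it was at version 0;
        // option("timestampAsOf", ...) selects by time instead.
        val v0 = spark.read.format("delta")
          .option("versionAsOf", 0).load(path)
        v0.show()
      }
    }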
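
The high-watermark sketch: one of the incremental-load strategies named above. The JDBC source, connection details, and watermark storage are hypothetical; the pattern is simply "pull only rows changed since the last recorded watermark, then advance the watermark."

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, lit, max}

    object HighWatermarkLoadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("high-watermark-load-sketch")
          .getOrCreate()

        // Hypothetical: in practice the last watermark comes from a
        // control table or parameter store, not a hard-coded value.
        val lastWatermark = java.sql.Timestamp.valueOf("2023-12-01 00:00:00")

        // Hypothetical JDBC source; only rows updated since the last run
        // are pulled, which is what makes the load incremental.
        val increment = spark.read
          .format("jdbc")
          .option("url", "jdbc:postgresql://source-db:5432/sales")
          .option("dbtable", "orders")
          .option("user", "etl")
          .option("password", sys.env("ETL_DB_PASSWORD"))
          .load()
          .filter(col("updated_at") > lit(lastWatermark))

        increment.write.format("delta").mode("append").save("/tmp/orders_delta")

        // The new watermark is the max change timestamp just processed;
        // persisting it back to the control table is left out of this sketch.
        val newWatermark = increment.agg(max("updated_at")).first().getTimestamp(0)
        println(s"advance watermark to $newWatermark")
      }
    }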
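
The SQL sketch: a common table expression combined with a windowing function, here picking each customer's most recent order from a hypothetical orders table. It reuses the SparkSession from the sketches above.

    // Reuses the SparkSession `spark` from the sketches above; assumes a
    // registered table "orders" with the columns referenced below.
    val latestOrders = spark.sql("""
      WITH ranked AS (
        SELECT customer_id,
               order_id,
               order_total,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY order_ts DESC) AS rn
        FROM orders
      )
      SELECT customer_id, order_id, order_total
      FROM ranked
      WHERE rn = 1
    """)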

Company Information