Required Skills

Java, Python, Spark

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Nov 2023

JOB DETAIL

- Design, develop, and implement scalable data pipelines and ETL processes using Java, Python, and Spark (see the illustrative sketch after this list).

- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design efficient solutions.

- Manage and optimize Spark clusters to ensure high performance and reliability.

- Perform data exploration, data cleaning, and data transformation tasks to prepare data for analysis and modeling.

- Develop and maintain data models and schemas to support data integration and analysis.

- Implement data quality and validation checks to ensure accuracy and consistency of data.

- Utilize REST API development skills to create and integrate data services and endpoints for seamless data access and consumption.

- Monitor and troubleshoot data pipeline performance, identifying and resolving bottlenecks and issues.

- Stay updated with the latest technologies and trends in big data, data engineering, data science, and REST API development, and provide recommendations for process improvements.

- Mentor and guide junior team members, providing technical leadership and sharing best practices.
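
For illustration only, below is a minimal sketch of the kind of Spark-based ETL pipeline described in the responsibilities above, written in Python (PySpark). The input/output paths, column names (order_id, customer_id, amount, order_date), and the data-quality rule are hypothetical assumptions for the sketch, not details taken from this posting.

```python
# Minimal PySpark ETL sketch: extract, validate, transform, load.
# All paths, columns, and rules below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV files (hypothetical location).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Clean and standardize: drop duplicates, enforce types, parse dates.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Simple data-quality check: fail fast if any amounts are negative.
bad_rows = cleaned.filter(F.col("amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} negative amounts")

# Transform: aggregate order amounts per customer per day.
daily_totals = cleaned.groupBy("customer_id", "order_date").agg(
    F.sum("amount").alias("daily_total")
)

# Load: write curated output as partitioned Parquet (hypothetical destination).
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)

spark.stop()
```

The same extract, validate, transform, load structure applies whether the source is flat files, a message queue, or a warehouse table; only the read and write steps change.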


Qualifications:

- Master's degree in Computer Science, Data Science, or a related field.

- Minimum of 3 years of professional experience in data engineering, working with Java, Python, Spark, and big data technologies.

- Strong programming skills in Java and Python, with expertise in building scalable and maintainable code.

- Proven experience in Spark cluster management, optimization, and performance tuning.

- Solid understanding of data science concepts and experience working with data scientists and analysts.

- Proficiency in SQL and experience with relational databases (e.g., Snowflake, Delta Tables).

- Experience in designing and developing REST APIs using frameworks such as Flask or Spring (a minimal illustrative sketch follows this list).

- Familiarity with cloud-based data platforms (e.g., Azure).

- Experience with data warehousing concepts and tools (e.g., Snowflake, BigQuery) is a plus.

- Strong problem-solving and analytical skills, with the ability to tackle complex data engineering challenges.

- Excellent communication and collaboration skills, with the ability to work effectively in a team-oriented environment.
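
For illustration only, below is a minimal sketch of a REST data endpoint of the kind referenced above, using Flask. The route, the in-memory data, and the response shape are hypothetical assumptions for the sketch, not requirements from this posting.

```python
# Minimal Flask REST endpoint sketch serving curated data.
# The route, data, and response shape are illustrative assumptions.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Hypothetical in-memory stand-in for a curated dataset or warehouse query.
DAILY_TOTALS = {
    "C001": [{"order_date": "2023-11-28", "daily_total": 120.50}],
    "C002": [{"order_date": "2023-11-28", "daily_total": 75.00}],
}

@app.route("/api/v1/customers/<customer_id>/daily-totals", methods=["GET"])
def get_daily_totals(customer_id):
    """Return daily order totals for a customer, or 404 if unknown."""
    totals = DAILY_TOTALS.get(customer_id)
    if totals is None:
        abort(404, description=f"Unknown customer: {customer_id}")
    return jsonify({"customer_id": customer_id, "daily_totals": totals})

if __name__ == "__main__":
    app.run(debug=True)
```

In a real service the handler would query a database or warehouse instead of an in-memory dictionary, and the app would run behind a production WSGI server rather than Flask's debug server.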

Company Information