Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 14th Dec 2023

JOB DETAIL

• Design, develop, and maintain new data capabilities and infrastructure for utilizing Mastercard, third-party, and partner data to enhance Mastercard's data products and solutions.

• Create new data pipelines, data transfers, and compliance-oriented infrastructure to facilitate seamless data utilization within cloud environments (a minimal sketch follows this list).

• Identify existing data capability and infrastructure gaps or opportunities within and across initiatives, and provide subject matter expertise in support of remediation.

• Collaborate with technical teams and business stakeholders to understand data requirements and translate them into technical solutions.

• Work with large datasets, ensuring data quality, accuracy, and performance.

• Implement data transformation, integration, and validation processes to support analytics and reporting needs.

• Optimize and fine-tune data pipelines for improved speed, reliability, and efficiency.

• Implement best practices for data storage, retrieval, and archival to ensure data accessibility and security.

• Troubleshoot and resolve data-related issues, collaborating with the team to identify root causes.

• Document data processes, data lineage, and technical specifications for future reference.

• Participate in code reviews, ensuring adherence to coding standards and best practices.

• Collaborate with DevOps teams to automate deployment and monitoring of data pipelines.
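The sketch below is a minimal illustration of the kind of batch pipeline step described above, written with Spark's Scala API. The paths, column names, and validation rule are hypothetical placeholders; it only shows the general pattern of reading raw data, applying quality checks, and writing partitioned output for downstream analytics.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Illustrative pipeline step: read raw partner data, apply basic validation,
// and write a partitioned, analytics-ready table.
// Paths, column names, and the quality rules are placeholders, not a real spec.
object TransactionPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-pipeline")
      .getOrCreate()

    val raw = spark.read
      .option("header", "true")
      .csv("s3a://example-bucket/raw/transactions/")    // hypothetical source

    val cleaned = validate(raw)

    cleaned.write
      .mode("overwrite")
      .partitionBy("txn_date")                          // partitioning for downstream reads
      .parquet("s3a://example-bucket/curated/transactions/")

    spark.stop()
  }

  // Keeping the transformation in a pure DataFrame-to-DataFrame function
  // makes it easy to unit test against small in-memory data.
  def validate(df: DataFrame): DataFrame =
    df.filter(col("txn_id").isNotNull)                  // drop rows missing a key
      .withColumn("amount", col("amount").cast("double"))
      .filter(col("amount") > 0)                        // simple data-quality rule
}
```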
 

EXPERIENCE
 

•       Bachelor's or Master's degree in Computer Science, Engineering, or a related field

•       Proven experience in data engineering, with a strong track record of designing and implementing data solutions

•       Proficiency in programming languages such as Python, Java, or Scala, and experience with data processing frameworks (Spark, Hadoop, etc.)

•       In-depth understanding of data warehousing concepts, cloud platforms (AWS, Azure, GCP), and data modeling techniques

•       Experience dealing with large volumes of structured and unstructured data from various sources.

•       Ability to triage and talk through performance and scaling issues when working with data at scale.

•       Good understanding of how data will be read (file formats, partitioning, bucketing).

•       Extensive experience writing testable jobs using Spark (or an equivalent framework); see the sketch after this list.

•       Programming & Scripting Languages: Java EE, Scala, Spark, SQL, Bash.

•       Web services & API standards: REST, OAuth, JSON.

•       Software Architectures (microservices, event-driven, peer-to-peer).

•       Application Security.

•       Asynchronous Pub-Sub and Point-to-Point Messaging Systems.

•       An advantage if you have experience working with ETL and the Hadoop ecosystem: HBase, Solr, Spark Streaming, Kudu, Spring Boot, Spring Context, Spring Data REST, and general Cloudera experience.

•       Streaming within the Hadoop ecosystem is a plus.
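As a companion to the "testable Spark jobs" point above, here is one possible way such a job could be unit tested: running the hypothetical validate() function from the earlier pipeline sketch against a local SparkSession and a tiny in-memory dataset. ScalaTest is assumed as the test framework; the data and assertions are purely illustrative.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Illustrative unit test for the transformation in the pipeline sketch above.
// A local SparkSession and a small in-memory dataset keep the test fast and
// independent of any real cluster or data source.
class ValidateSpec extends AnyFunSuite {
  private val spark = SparkSession.builder()
    .master("local[2]")
    .appName("validate-spec")
    .getOrCreate()

  import spark.implicits._

  test("validate drops null ids and non-positive amounts") {
    val input = Seq(
      (Some("t1"), "10.5"),   // valid row, should survive
      (None, "3.0"),          // missing key, should be dropped
      (Some("t2"), "-1.0")    // non-positive amount, should be dropped
    ).toDF("txn_id", "amount")

    val result = TransactionPipeline.validate(input)

    assert(result.count() == 1)
    assert(result.select("txn_id").as[String].collect().toSeq == Seq("t1"))
  }
}
```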

 

Company Information