Required Skills

Python.

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 24th Jan 2024

JOB DETAIL

A global financial firm with offices in Newark, NJ is seeking a Fixed Income Data Engineer to design, develop, and maintain ETL processes and data pipelines that collect and transform data from various sources (e.g., databases, APIs, logs) into a structured, usable format. You will work in an agile environment through collaboration, ownership, and innovation.

MUST HAVE:

·5+ years of experience building data pipelines in Python.

·Experience developing and deploying PySpark/Python scripts in a cloud environment.

·Experience working in the AWS cloud, especially services such as S3, Glue, Lambda, Step Functions, DynamoDB, and ECS.

·Strong knowledge of data warehousing and ETL processes.

·Python API Development/Snowflake Snowpark coding experience

·Understanding of capital markets within Fixed Income (structured products, bonds, FX, etc.).

Our Role:

·Design, develop, and maintain ETL processes and data pipelines to collect and transform data from various sources (e.g., databases, APIs, logs) into a structured and usable format.

·Create and maintain data storage solutions, such as data warehouses, data lakes, and databases. Optimize data storage structures for performance and cost-effectiveness.

·Integrate and merge data from different sources while ensuring data quality, consistency, and accuracy.

· Manage and optimize data warehouses to store and organize data for efficient retrieval and analysis.

· Cleanse, preprocess, and transform data to meet business requirements and maintain data quality.

·Monitor data pipelines and database performance to ensure data processing efficiency.

·Implement and maintain security measures to protect sensitive data and ensure compliance with data privacy regulations.

·Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications.

·Build and maintain code to manage data received in heterogeneous formats (binary, ASCII) from web-based sources, internal/external databases, and flat files.

·Build a new enterprise data warehouse and maintain the existing one.

·Design and support effective storage and retrieval of very large internal and external data sets, and think ahead about the convergence strategy with our AWS cloud migration.

·Assess the impact of scaling up and scaling out and ensure sustained data management and data delivery performance.

·Build interfaces to support evolving and new applications and to accommodate new data sources and data types.

Your Required Skills:

·5+ years of experience building data pipelines in Python.

·Strong knowledge of data warehousing, ETL processes, and database management.

·Proficiency in data modeling, database design, and SQL.

·3+ years of experience developing and deploying PySpark/Python scripts in a cloud environment.

·3+ years of experience working in the AWS cloud, especially services such as S3, Glue, Lambda, Step Functions, DynamoDB, and ECS.

·1+ years of hands-on experience in the design and development of data ingress/egress patterns on Snowflake.

·Proficiency in Aurora Postgres database clusters on AWS

·Familiarity with orchestration tools such as Airflow and Autosys.

·Experience with data lakes, data marts, and data warehouses.

·Proficiency in SQL, data querying, and performance optimization techniques

·Ability to communicate status, challenges, and proposed solutions to the team.

·Demonstrated ability to learn new skills and work as part of a team.

·Knowledge of data security and privacy best practices.

·Working knowledge of data governance and ability to ensure high data quality is maintained throughout the data lifecycle of a project.

·Knowledge of data visualization and business intelligence tools (e.g., Tableau, Power BI).

·Ability to prioritize multiple tasks and projects and work effectively under pressure; exceptional organizational and administrative skills; at ease with an abundance of detail while remaining mindful of the big picture.

·Strong analytical and problem-solving skills, with the ability to conduct root-cause analysis on system, process, or production problems and to provide viable solutions.

·Experience working in an Agile environment with a Scrum Master/Product Owner and the ability to deliver.

 

Your Desired Skills:

·Good exposure to containers such as ECS or Docker.

·Python API Development/Snowflake Snowpark coding experience

·Streaming or messaging knowledge with Kafka or Kinesis is desirable.

·Understanding of capital markets within Fixed Income

Company Information