Required Skills

Business Intelligence, Database Marketing, Hadoop, Spark, Data Modeling, HDFS, ETL, Solution Design, SQL, Business Analysis

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of Positions: 1

  • Posted: 31st May 2022

JOB DETAIL


Role Description

We are looking for a Data Solutions Engineer to engage with the larger Conversant Engineering Data Warehousing teams.
This role will be hands-on in code development, driving solutions to delivery by engaging effectively with team members across the globe.
The person in this role will need to work independently to meet the required specifications of solution delivery; the output work product will be incorporated into the production environments that manage all data processing within Conversant.
Expertise is required in complex ETL, data processing, and data aggregation.
Responsibilities

You will design and code solutions, on and off database, to ensure application access that enables data-driven decision making for the company's multi-faceted ad serving operations.
You will work closely with Engineering resources across the globe to ensure that enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model.
You should be able to develop test cases and a validation methodology to demonstrate that the work product meets the required needs.
The ideal candidate can lead in the areas of solution design, code development, quality assurance, data modeling, business intelligence, cross-team communication, project management, and application maintenance.
Qualifications

Required qualifications:

  • Bachelor's Degree in Computer Science or an equivalent degree is required.

  • 3+ years of business analysis experience around database marketing technologies and data management, and technical understanding in these areas.

  • Strong experience in Hadoop, HDFS, and Spark is a must.

  • Experience dealing with large-volume data processing at terabyte scale is required.

  • Strong experience in SQL.

  • Experience with PostgreSQL, Greenplum, or a similar MPP database.

  • Experience with Python scripting.

  • Experience with scheduling applications with complex interdependencies.

  • Good experience working with geographically and culturally diverse teams.

  • Familiarity with complex data lake environments that span OLTP, MPP, and Hadoop platforms.

  • Understanding of Disaster Recovery and Business Continuity solutions.

  • Excellent written and verbal communication skills.

  • Ability to handle complex products.

  • Excellent analytical and problem-solving skills; ability to diagnose and troubleshoot problems quickly.

Required Skills

Specifications, Solution Delivery, Test Cases, Methodology, Business Analysis, Database Marketing, Hadoop, HDFS, Spark, Data Processing, SQL, PostgreSQL, Python Scripting, Scheduling, Written and Verbal Communication Skills, Problem Solving Skills, Code Development, Able to work independently, Managing, ETL, Design, Access, Decision Making, Multi-Faceted Ad Serving Operations, Actionable, Business Model, Solution Design, Quality Assurance, Data Modeling, Business Intelligence, Communication, Project Management, Technical Understanding, OLTP, Hadoop Platforms, Disaster Recovery, Ability to handle

Company Information