Required Skills

ETL data analytics

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :- Not Required

  • PG :- Not Required

Other Information

  • No. of positions :- 1

  • Posted :- 28th Jul 2022

JOB DETAIL

A data analytics engineer is a specialized position within the data team: the role and its responsibilities sit at the intersection of analytics and engineering.

The analytics engineer role comes with three broad areas of responsibility:

1. Construct data pipelines. Whether it is ETL or ELT, the analytics engineer is responsible for delivering clean data sets to end-users, who may be data scientists, data analysts, or non-technical stakeholders (e.g. for self-service analytics). Constructing data pipelines covers everything from extracting raw data from multiple sources, to cleaning it (data transformations), to loading it into a data warehouse or database where end-users can access it (a minimal sketch of such a pipeline follows this list).

2. Build data models. The analytics engineer spends a considerable amount of time designing data models that optimize both the data warehousing side (e.g. query execution time) and the data consumption side (e.g. building a snowflake schema to optimize analytical queries; see the second sketch below). The process combines architectural and engineering design with analytical knowledge.

3. Perform DataOps. From deploying data warehouses to managing the operational aspects of data pipelines, analytics engineers own the DataOps cycle of data products. This includes monitoring data quality throughout the system, orchestrating jobs, and validating metrics and other downstream results of data pipelines (see the third sketch below).
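
The first sketch below, in Python, illustrates the shape of such a pipeline under stated assumptions: it extracts rows from a hypothetical orders.csv export, applies a simple cleaning transformation, and loads the result into a local SQLite file standing in for a warehouse. The file, table, and column names are placeholders, not part of the role description.

import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from a CSV export (placeholder source).
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: drop rows without an order id, normalise the amount to a float.
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue
        cleaned.append((row["order_id"], float(row.get("amount") or 0)))
    return cleaned

def load(rows, db_path="warehouse.db"):
    # Load: write the cleaned rows into a SQLite table standing in for a warehouse.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS clean_orders (order_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?)", rows)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))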
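
The second sketch shows a minimal snowflake-style model of the kind item 2 refers to, assuming an invented sales domain: a fact table keyed to dimension tables, with the customer dimension further normalised into a region sub-dimension, so analytical queries join on small keys instead of scanning wide denormalised rows. All table and column names are illustrative only.

import sqlite3

ddl = """
CREATE TABLE dim_region   (region_id INTEGER PRIMARY KEY, region_name TEXT);
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT,
                           region_id INTEGER REFERENCES dim_region(region_id));
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, day INTEGER, month INTEGER, year INTEGER);
CREATE TABLE fact_sales   (sale_id INTEGER PRIMARY KEY,
                           customer_id INTEGER REFERENCES dim_customer(customer_id),
                           date_id INTEGER REFERENCES dim_date(date_id),
                           amount REAL);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# A typical analytical query the model is designed for: revenue by region and year.
query = """
SELECT r.region_name, d.year, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_customer c ON c.customer_id = f.customer_id
JOIN dim_region   r ON r.region_id   = c.region_id
JOIN dim_date     d ON d.date_id     = f.date_id
GROUP BY r.region_name, d.year;
"""
print(conn.execute(query).fetchall())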
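
The third sketch is an illustrative data-quality check of the kind a DataOps step might run after a load; the table, columns, checks, and thresholds are assumptions, not a prescribed toolchain.

import sqlite3

# Stand-in warehouse table with sample rows so the check runs standalone.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clean_orders (order_id TEXT, amount REAL)")
conn.executemany("INSERT INTO clean_orders VALUES (?, ?)", [("A-1", 19.9), ("A-2", 5.0)])

# Each check returns the number of offending rows; zero means the check passes.
checks = {
    "no_null_order_ids": "SELECT COUNT(*) FROM clean_orders WHERE order_id IS NULL",
    "no_negative_amounts": "SELECT COUNT(*) FROM clean_orders WHERE amount < 0",
}

failures = {}
for name, sql in checks.items():
    bad_rows = conn.execute(sql).fetchone()[0]
    if bad_rows:
        failures[name] = bad_rows

if failures:
    # In a real pipeline this would fail the job or alert the on-call engineer.
    raise RuntimeError(f"Data quality checks failed: {failures}")
print("All data quality checks passed.")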

Data engineer:

  • Takes care of data ingestion and integration for the entire company, joining disparate data sources and real-time data streams using scripting or data tools such as Keboola, Stitch, Fivetran, etc. (a rough sketch of this kind of integration follows this list).

• Deploys, manages, and optimizes the data warehouse (Snowflake, BigQuery, Redshift, …).

  • Performs DataOps by orchestrating jobs and maintaining the cloud solution (AWS, Azure, GCP) or on-premises servers.
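
As a rough sketch of the ingestion-and-integration work in the first bullet, the snippet below merges two hypothetical sources, a CSV export and a JSON payload such as an API response, on a shared key; in practice this is the kind of plumbing often handed to tools like Stitch or Fivetran rather than hand-written. All file and field names are invented for illustration.

import csv
import json

def load_csv(path):
    # Source 1: a CSV export keyed by user_id (placeholder file name).
    with open(path, newline="") as f:
        return {row["user_id"]: row for row in csv.DictReader(f)}

def load_json(path):
    # Source 2: a JSON payload, e.g. dumped from an API (placeholder file name).
    with open(path) as f:
        return {str(rec["user_id"]): rec for rec in json.load(f)}

crm_users = load_csv("crm_export.csv")
app_events = load_json("app_events.json")

# Integrate: enrich each CRM record with an event count from the second source.
merged = []
for user_id, row in crm_users.items():
    row["event_count"] = app_events.get(user_id, {}).get("event_count", 0)
    merged.append(row)

print(f"Merged {len(merged)} user records from two sources.")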

Thanks & Regards,
