Required Skills

cloud infrastructure, R, SAS, Oracle, Azure, data modeling, GCP, ETL development, SPSS, AWS, Data Analysis, Data Profiling, Python

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 23rd Aug 2023

JOB DETAIL

BE | BTech | MCA | BSc candidates with 2 to 6 years of work experience. Job Description:
Strong analytical, technical, and domain skills; a strong conceptual problem-solving mindset; and a high level of comfort working with high-volume data.
Demonstrated experience with techniques, programming, and tools relevant to metadata mapping and analysis, including a high level of proficiency with large databases/data warehouses (Oracle, Teradata) and tools such as SAS/SQL. Experience with SQL and the ability to write complex queries (see the query sketch following this job description).
Experience in supporting data modeling, ETL development, data warehousing, data pipeline and data lake creation projects.
Experience with cloud infrastructure, data ingestion, data wrangling and visualization tools for Oracle/Azure/GCP/AWS.
Ability to translate business problem statements into technical requirements.
Data Analysis, Data Profiling, Data Management skills.
Use automated tools to extract and acquire data from primary or secondary data sources.
Provide quality assurance of imported data; filter and “clean” data by reviewing reports and performance indicators to identify and correct problems.
Support initiatives for data integrity and normalization.
Perform Exploratory Data Analysis to assess and interpret the meaning of data. Scrutinize data to recognize and identify trends and patterns. Use statistical tools to identify, analyze, and interpret patterns and trends in complex data sets that can support diagnosis and prediction (see the analysis sketch following this job description).
Assess and compare financial and operational metrics across time periods at the enterprise, regional, and segment levels.
Generate reports from single or multiple systems in various formats, including interactive dashboards, Excel, PowerPoint, and others.
Prepare final analysis reports for stakeholders, enabling them to make important decisions based on facts and trends.
Monitor performance and identify improvements.
Work with different functional teams and business management heads to identify improvement opportunities.
Train end-users on new reports and dashboards.
Demonstrated experience in handling large data sets and relational databases.
Problem-solving skills.
Accuracy and strong attention to detail.
Demonstrated analytical expertise, including the ability to synthesize complex data.
Understanding of client needs and the ability to present findings effectively to stakeholders.
Experience with statistical methods: Correlation, Regression Analysis, Factor Analysis, Cluster Analysis, Decision Trees, etc.
Knowledge of how to create and apply the most accurate algorithms to datasets to find solutions is a plus.
Ability to collaborate effectively and work as part of a team.
Strong verbal and written communication skills.
Proficiency in statistics and statistical packages such as Excel, Python, R, SPSS, or SAS for data set analysis (preferred).
Knowledge of query languages such as SQL, or of ETL frameworks (preferred).
Knowledge of data visualization software like Tableau, Power BI, etc.
Knowledge of AWS/Azure data warehousing concepts (Data Lake, Azure Data Factory, Azure Synapse).
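
Query and cleaning sketch (illustrative only): a minimal Python example of the SQL and data-quality skills listed above, assuming pandas is installed. An in-memory SQLite database stands in for Oracle/Teradata, and the table name, column names, and threshold are hypothetical.

  # Minimal sketch: run a grouped, filtered query and do basic cleaning.
  # SQLite stands in for the real warehouse; all names and values are made up.
  import sqlite3
  import pandas as pd

  conn = sqlite3.connect(":memory:")
  conn.executescript("""
      CREATE TABLE sales (region TEXT, period TEXT, revenue REAL);
      INSERT INTO sales VALUES
          ('East', '2023-Q1', 120.0),
          ('East', '2023-Q2', NULL),
          ('West', '2023-Q1', 95.5),
          ('West', '2023-Q2', 101.2);
  """)

  # A grouped, filtered query of the kind the role describes:
  # total revenue per region, keeping only regions above a threshold.
  query = """
      SELECT region, SUM(revenue) AS total_revenue, COUNT(*) AS periods
      FROM sales
      GROUP BY region
      HAVING SUM(revenue) > 100
  """
  summary = pd.read_sql_query(query, conn)

  # Quality assurance on the raw extract: profile missing values,
  # then apply one simple cleaning rule for illustration.
  raw = pd.read_sql_query("SELECT * FROM sales", conn)
  print(raw.isna().sum())
  clean = raw.fillna({"revenue": 0.0})
  print(summary)
  print(clean)

In practice the same query would run against the actual warehouse through its own driver; the in-memory database only keeps the sketch self-contained and runnable.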
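
Analysis sketch (illustrative only): a minimal exploratory-data-analysis example in Python, assuming pandas and NumPy are available; the monthly revenue and cost figures are invented. It shows summary statistics, pairwise correlation, and an ordinary-least-squares trend fit as one simple instance of the statistical methods listed above; a scikit-learn model such as a decision tree would follow the same load-explore-fit pattern.

  # Minimal EDA sketch on hypothetical monthly metrics: summary statistics,
  # correlations, and a straight-line trend fit for month-over-month growth.
  import numpy as np
  import pandas as pd

  df = pd.DataFrame({
      "month":   list(range(1, 13)),
      "revenue": [100, 104, 103, 110, 115, 113, 120, 125, 123, 130, 134, 140],
      "costs":   [80, 82, 81, 85, 88, 87, 90, 93, 92, 96, 98, 101],
  })

  print(df.describe())  # per-column summary statistics
  print(df.corr())      # pairwise correlations between the metrics

  # Ordinary least squares fit of revenue against month; the slope gives a
  # rough month-over-month growth estimate for trend reporting.
  slope, intercept = np.polyfit(df["month"], df["revenue"], deg=1)
  print(f"Estimated monthly revenue growth: {slope:.2f}")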

Company Information