Required Skills

MicroStrategy, Automation, Linux

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 2nd Sep 2022

JOB DETAIL

 

KEY RESPONSIBILITIES

  • Leverage and apply new and emerging Enterprise Data Management practices using data platforms
  • Design, plan, and develop programs to automate extract, transform, and load (ETL) operations between data sources
  • Implement process improvements (automation, performance tuning, workflow optimization)
  • Develop and/or execute implementations according to project plans and priorities
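As a purely illustrative sketch of the automated ETL work described above (the source data, schema, and column names here are hypothetical, not taken from this posting), a minimal extract-transform-load step in Python might look like:

```python
# Hypothetical ETL sketch: extract rows from CSV text, transform them,
# and load them into a SQLite table. Names and schema are illustrative.
import csv
import io
import sqlite3

def etl(csv_text, conn):
    """Run one extract-transform-load pass and return the row count loaded."""
    # Extract: parse the delimited source into dict rows.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: trim whitespace and normalize the code column to upper case.
    cleaned = [(r["name"].strip(), r["code"].strip().upper()) for r in rows]
    # Load: create the target table if needed and insert the cleaned rows.
    conn.execute("CREATE TABLE IF NOT EXISTS items (name TEXT, code TEXT)")
    conn.executemany("INSERT INTO items VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

source = "name,code\n alpha ,ab\n beta ,cd\n"
conn = sqlite3.connect(":memory:")
loaded = etl(source, conn)
print(loaded)  # number of rows loaded
```

In practice the role would use platform tooling (e.g. Spark or Databricks pipelines) rather than the standard library, but the extract/transform/load separation shown here is the same.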


Qualifications

ESSENTIAL EXPERIENCE

  • Minimum of 5 years of professional software development experience.
  • Object-oriented programming and component-based development with Java/Python.
  • Experience working with both structured and unstructured data, with a high degree of SQL knowledge.
  • Experience designing and implementing scalable ETL/ELT processes.
  • Experience in modeling data for low latency reporting.
  • Expertise with Big Data ecosystem services such as Spark/SparkSQL, Kafka, Kafka Streams, Apache NiFi, Hadoop/Hive, MongoDB, and HBase.
  • Experience with modern Big Data analytics using data lakes, Spark, and file formats such as Avro, JSON, and Parquet.
  • Experience working with large cloud data lakes.
  • Hands-on experience building data pipelines with Databricks.
  • Performance tuning, troubleshooting and diagnostics, process monitoring, and profiling.
  • Ability to work in a fast-paced environment with evolving requirements and capability goals.
  • Appetite for learning new and emerging technologies.
  • Strong problem-solving skills and communication skills.
  • Self-motivated, driven, and independent individual.

PREFERRED EXPERIENCE

  • Prior experience with any cloud stack, preferably Azure.
  • Prior experience with open source Apache projects.
  • Understanding of reporting tools such as Power BI, MicroStrategy, and Tableau.
  • Experience with orchestrating complex data pipelines using tools like Azure Data Factory or Apache Airflow.
  • Experience with large-scale data processing, complex event processing, and stream processing.
  • Experience working with CI/CD pipelines, source code repositories, and operating environments.
  • Understanding of containerization, virtualization, and cloud computing.
  • Experience working in the Scrum Agile software development framework.
  • Demonstrated expertise in Linux.

Company Information