Backup Metadata Analyzer

[Screenshot: Overview Page]

Product Vision

Data backups are an essential safeguard for businesses. They ensure that no important information or work artifacts are lost due to technical errors or hacking. This metadata analyzer helps detect problems and anomalies in backups. Detection happens in a timely manner, and in some cases even predictively, so that the customer can react before major damage occurs.

Project Mission

This project explores which data analysis methods are applicable to the backup metadata generated by the industry partner's backup software "SESAM". The results of the analysis are condensed into insights and presented as graphs and alerts. This helps the customer better understand their backup data and focus on the important takeaways.

Project Structure

[Architecture Diagram]

The project is structured in three modules:

  • Frontend
  • Backend
  • Analyzer

Frontend

The frontend is developed in TypeScript using Angular and serves to present the analysis of the backup metadata to the user. It includes an overview page for quickly assessing the state of the backups, as well as a detailed backup statistics page and a page for viewing all triggered alerts.

Backend

The backend is written in TypeScript using NestJS and acts as a bridge between the frontend and the analyzer. Its primary function is to store the results from the analysis module, but it also supports additional features such as sending emails when alerts are triggered.

Analyzer

The analyzer is implemented in Python and uses Flask to provide REST APIs for updating backend data and managing various analysis modes. Each analysis mode examines a different aspect of the metadata, using custom algorithms to generate alerts whenever irregularities are detected.
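The actual routes live in the analyzer's source; as a purely hypothetical sketch of calling such an endpoint (the route name is invented for illustration, and 5000 is only Flask's default port):

     # Hypothetical route -- check the analyzer code for the real endpoints and port.
     curl -X POST http://localhost:5000/refreshData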

Prerequisites

Make sure the following are installed on your machine:

  • Node 20 with npm
  • Docker
  • Docker Compose
  • Python 3.11 and Poetry
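You can verify the installations with the following commands:

     node --version      # should print v20.x
     npm --version
     docker --version
     docker compose version
     python3 --version   # should print Python 3.11.x
     poetry --version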

Build and run

Now you have two options:

    1. Build and run with Docker
    2. Build and run with Nx (for local development)

Docker Build Setup Instructions

  1. Clone the repository:

    git clone https://github.com/amosproj/amos2024ws02-backup-metadata-analyzer.git
    
  2. Change directory:

     cd ./amos2024ws02-backup-metadata-analyzer/
    
  3. Set up the .env files:

    In the project's root folder, copy .env.docker.example and rename the copy to .env.docker
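    For example:

     cp .env.docker.example .env.docker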

  4. Copy database dump into project:

    Copy the database dump .dmp file into the project's root folder and rename it to db_dump.sql
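    For example (the source path below is a placeholder for wherever your dump is stored):

     cp /path/to/your_dump.dmp ./db_dump.sql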

  5. Build Docker container:

     docker compose --env-file .env.docker build --no-cache
    
  6. Start Docker container:

     docker compose --env-file .env.docker up
    
  7. Stop Docker Container:

     docker compose --env-file .env.docker down

Local dev build and run instructions

  • Use npm ci to install the local Node dependencies.

  • Navigate into the apps/analyzer/metadata_analyzer directory and run poetry install to create the Python virtual environment and install the dependencies.

  • In apps/backend and apps/analyzer: copy the .env.example files and rename the copies to .env (see the commands after this list).

  • Make sure PostgreSQL databases are running on the connections defined in the .env files.

  • The analyzer database should contain the backup metadata to be analyzed.

  • The backend database should initially be empty; it is used to store the analysis results.
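For the .env step above, assuming the example files sit directly in apps/backend and apps/analyzer, the copy boils down to:

     cp apps/backend/.env.example apps/backend/.env
     cp apps/analyzer/.env.example apps/analyzer/.env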

(Suggestion) Use Docker to provide the database(s):

  • If you only want to provide the analyzer database or the backend database via Docker, adjust the commands below accordingly.
  • Prepare the .env.docker file (see step 3 of the Docker setup instructions).
  • docker compose --env-file .env.docker build --no-cache backendDatabase analyzerDatabase
  • docker compose --env-file .env.docker up backendDatabase analyzerDatabase

Once you have the databases running:

  • npm run all to run all modules at the same time

If you want to run the modules individually:

  • npm run py to run the python analyzer
  • npm run be to run the TypeScript backend
  • npm run fe to run the frontend