🏭⚗ Distillation Industrial Anomaly Detection ⚗🏭

Description

This project brings together state-of-the-art approaches that use knowledge distillation for unsupervised anomaly detection. The code is designed to be simple and understandable, so that custom modifications are easy to make.

Getting Started

You will need Python 3.10+ and the packages specified in requirements.txt.

Install packages with:

pip install -r requirements.txt

Basic usage

Configuration

To use the project, you must first configure the config.yaml file. This file controls the main elements of the project:

  • data_path (STR): The path to the dataset
  • distillType (STR): The type of distillation: st for STPM, rd for reverse distillation, ead for EfficientAD, dbfad for distillation-based fabric anomaly detection, mixed for MixedTeacher, rnst/rnrd for remembering normality (forward/backward), sn for SingleNet
  • backbone (STR): The name of the model backbone (any CNN for st; only ResNets and Wide ResNets for rd; small or medium for ead)
  • out_indice (LIST OF INT): The indices of the layers used for distillation (only for st)
  • obj (STR): The object category
  • phase (STR): Either train or test
  • save_path (STR): The path where the model weights are saved
  • training_data (YAML LIST): The training hyperparameters (epochs, batch_size, img_size, crop_size, norm, and other parameters)

An example config for each distillType is available in configs/.
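
For reference, here is a minimal sketch of what a config.yaml could look like. The field names come from the list above; every value is an illustrative assumption, not a tested configuration — the files in configs/ are the authoritative examples.

data_path: ./datasets/mvtec   # assumption: dataset location
distillType: st               # STPM-style forward distillation
backbone: resnet18            # any CNN is accepted for st
out_indice: [1, 2, 3]         # layers used for distillation (st only)
obj: carpet                   # object category
phase: train                  # train or test
save_path: ./weights          # where model weights are stored
training_data:
  epochs: 100
  batch_size: 16
  img_size: 256
  crop_size: 256
  norm: true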

Training and testing

Once configured, run the following command to train or test (depending on the phase set in the configuration file):

python3 trainNet.py

You can also visualize the feature maps of a given layer; the selected layer can be changed within the Python file:

python3 visualization.py
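
visualization.py is the repository's own script; as a rough illustration of what selecting a layer for feature-map inspection can look like in PyTorch (the backbone, layer name, and input below are assumptions, not the repo's code):

import torch
import torchvision
import matplotlib.pyplot as plt

# Illustrative only: load a backbone and hook one of its layers.
model = torchvision.models.resnet18(weights="IMAGENET1K_V1").eval()
features = {}

def hook(module, inputs, output):
    features["layer2"] = output.detach()

model.layer2.register_forward_hook(hook)  # <- the selected layer; change it here

x = torch.randn(1, 3, 256, 256)  # stand-in for a preprocessed image
with torch.no_grad():
    model(x)

# Average over channels to obtain a single 2D map and display it.
fmap = features["layer2"][0].mean(dim=0)
plt.imshow(fmap.numpy(), cmap="viridis")
plt.show()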

Implemented methods

STPM: Student-Teacher Feature Pyramid Matching for Unsupervised Anomaly Detection

Article
Code inspiration
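
At its core, STPM trains the student to match the teacher's L2-normalized feature maps at several pyramid levels. A minimal sketch of that loss, assuming lists of matching teacher/student feature tensors (not this repo's exact code):

import torch.nn.functional as F

def stpm_loss(teacher_feats, student_feats):
    # Per-layer distance between L2-normalized feature maps, summed over the pyramid.
    loss = 0.0
    for t, s in zip(teacher_feats, student_feats):
        t = F.normalize(t, dim=1)  # normalize along the channel axis
        s = F.normalize(s, dim=1)
        # 0.5 * squared distance summed over channels, averaged over pixels
        loss = loss + 0.5 * (t - s).pow(2).sum(dim=1).mean()
    return loss

At test time, the same per-pixel distances (upsampled and combined across layers) form the anomaly map.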

Reverse distillation: Anomaly Detection via Reverse Distillation from One-Class Embedding

Article1 and Article2
Code inspiration
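
In reverse distillation, a frozen teacher encodes the image, a bottleneck compresses the result into a one-class embedding, and a student decoder reconstructs the teacher's multi-scale features from it; anomalies surface where reconstruction fails. A hedged sketch of the usual per-layer cosine-distance training loss (not this repo's exact implementation):

import torch.nn.functional as F

def rd_loss(encoder_feats, decoder_feats):
    # 1 - cosine similarity between teacher (encoder) and student (decoder)
    # feature maps, computed per spatial location and averaged.
    loss = 0.0
    for e, d in zip(encoder_feats, decoder_feats):
        cos = F.cosine_similarity(e.flatten(2), d.flatten(2), dim=1)
        loss = loss + (1 - cos).mean()
    return loss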

EfficientAD : Accurate Visual Anomaly Detection at Millisecond-Level Latencies

Article
Code inspiration
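
One distinctive ingredient of EfficientAD is its hard feature loss: only the largest teacher-student discrepancies are backpropagated, which keeps the student from imitating the teacher everywhere. A minimal sketch of that idea (the quantile value and the reduction are assumptions):

import torch

def hard_feature_loss(teacher_feat, student_feat, q=0.999):
    # Squared feature distance, restricted to the hardest fraction of entries.
    d = (teacher_feat - student_feat).pow(2)
    thresh = torch.quantile(d.flatten(), q)
    return d[d >= thresh].mean()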

DBFAD: Distillation-based fabric anomaly detection

Article
Code inspiration

MixedTeacher: Knowledge Distillation for fast inference textural anomaly detection

Article
Code inspiration

Remembering Normality: Memory-guided Knowledge Distillation for Unsupervised Anomaly Detection

Article
Code inspiration

SingleNet: Single-Layer Distillation with Fourier Convolutions for Texture Anomaly Detection

WACV 2025; article not yet published.

Implemented tools

SSPCAB: Self-Supervised Predictive Convolutional Attentive Block for Anomaly Detection

Article
Code inspiration
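
SSPCAB replaces a convolution inside the network with a center-masked receptive field (four corner sub-kernels) followed by squeeze-and-excitation channel attention, and is trained with an auxiliary reconstruction loss so the block must predict the masked region from its context. A rough PyTorch sketch of such a block, with hyperparameters chosen as assumptions:

import torch.nn as nn
import torch.nn.functional as F

class SSPCABSketch(nn.Module):
    # Masked convolution (four corner sub-kernels) + channel attention.
    def __init__(self, channels, kernel_dim=1, dilation=1, reduction=8):
        super().__init__()
        self.pad = kernel_dim + dilation
        self.crop = kernel_dim + 2 * dilation + 1
        self.k1 = nn.Conv2d(channels, channels, kernel_dim)
        self.k2 = nn.Conv2d(channels, channels, kernel_dim)
        self.k3 = nn.Conv2d(channels, channels, kernel_dim)
        self.k4 = nn.Conv2d(channels, channels, kernel_dim)
        self.se = nn.Sequential(  # squeeze-and-excitation attention
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        p = F.pad(x, (self.pad,) * 4, mode="reflect")
        c = self.crop
        out = (self.k1(p[..., :-c, :-c]) + self.k2(p[..., :-c, c:]) +
               self.k3(p[..., c:, :-c]) + self.k4(p[..., c:, c:]))
        return out * self.se(out)  # attention-weighted masked prediction

During training, an extra MSE term between the block's input and output pushes it to reconstruct the masked center from the surrounding context.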

License

This project is licensed under the MIT License.