Comparison of the performance of the autoencoder (AE) forest algorithm with MLP & CNN AEs.

Skoltech-ML-2020-AutoEncoder-by-Forest

Typically, autoencoders (AEs) are associated with neural networks. Yet in the paper "AutoEncoder by Forest", the authors propose using decision tree ensembles as autoencoders and claim that their approach achieves reasonable performance. Here we reproduce the paper's results: we implemented the AE forest algorithm and compared its performance with MLP and CNN AEs on image datasets (MNIST, CIFAR-10, Omniglot).
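The core eForest idea can be sketched in a few lines: a sample is encoded as the vector of leaf indices it reaches in each tree, and decoded by intersecting the axis-aligned regions of those leaves (the paper's Maximal-Compatible Rule) and picking a point inside. The sketch below is illustrative only, not the project's implementation; it uses scikit-learn's `RandomForestRegressor` as a stand-in for the completely-random forest in the paper, on toy data in [0, 1]^8.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 8))  # toy data in [0, 1]^8

# Trees predicting X from X -- a simplification of the
# completely-random forest used in the paper.
forest = RandomForestRegressor(n_estimators=10, max_depth=6, random_state=0)
forest.fit(X, X)

def encode(forest, x):
    """Encode a sample as its leaf index in every tree."""
    return forest.apply(x.reshape(1, -1))[0]  # shape: (n_estimators,)

def leaf_in_subtree(t, node, leaf):
    """Check whether `leaf` lies in the subtree rooted at `node`."""
    if node == leaf:
        return True
    left = t.children_left[node]
    if left == -1:          # -1 marks a leaf in sklearn's tree arrays
        return False
    return (leaf_in_subtree(t, left, leaf)
            or leaf_in_subtree(t, t.children_right[node], leaf))

def decode(forest, leaves, n_features):
    """Decode by intersecting the leaf regions of all trees
    (Maximal-Compatible Rule) and returning the box midpoint."""
    lo = np.zeros(n_features)
    hi = np.ones(n_features)
    for tree, leaf in zip(forest.estimators_, leaves):
        t = tree.tree_
        node = 0
        while node != leaf:  # walk root -> leaf, tightening bounds
            f, thr = t.feature[node], t.threshold[node]
            left = t.children_left[node]
            if leaf_in_subtree(t, left, leaf):
                hi[f] = min(hi[f], thr)   # left branch: x[f] <= thr
                node = left
            else:
                lo[f] = max(lo[f], thr)   # right branch: x[f] > thr
                node = t.children_right[node]
    return (lo + hi) / 2

x = X[0]
code = encode(forest, x)
x_hat = decode(forest, code, X.shape[1])
```

Since the decoded point lies inside every leaf region that produced the code, re-encoding `x_hat` yields the same leaf-index vector.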

The code was written by:

  • Egor Sevriugov - Tree ensemble based AE (MNIST, CIFAR-10, Omniglot),
  • Kirill Shcherbakov - CNN based AE (MNIST, CIFAR-10, Omniglot),
  • Maria Begicheva - MLP based AE (MNIST, Omniglot),
  • Olga Novitskaya - MLP based AE (CIFAR-10, Omniglot)

AEbyForest: Project | Paper | Report | Presentation | Video

Train MNIST/Test MNIST

Train CIFAR10/Test CIFAR10

Colab Notebook

Prerequisites

  • Python 3
  • Google Colaboratory service
  • PyTorch 1.4.0, TensorFlow 2.1.0, Keras 2.3.0
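If you run outside Colab (where some of these frameworks come preinstalled), the pinned versions above can be installed in one step; this is an assumed pip-based setup, not an official install script from the project:

```shell
# Install the framework versions listed in Prerequisites
pip install torch==1.4.0 tensorflow==2.1.0 keras==2.3.0
```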

Datasets info

How to launch the code?

To help users better understand and use our code, we provide, for each model, instructions for running the code and reproducing the results:

Related Projects

  • The official implementation for the paper "AutoEncoder by Forest" by Ji Feng and Zhi-Hua Zhou 2017: Paper | Code
  • Unofficial implementation of the paper "AutoEncoder by Forest" by Ji Feng and Zhi-Hua Zhou 2017, by Antoine Passemiers: Paper | Code