
Master Thesis Project of Justus Hübotter


Code style: black

Spiking VAE


Description

This is my thesis project for the MSc. Artificial Intelligence program at Vrije Universiteit Amsterdam. I implemented an autoencoder network with spiking neurons in PyTorch.

Author

Justus F. Hübotter

Example

Image reconstruction example

The regularized spiking autoencoder model encodes to and decodes from a spiking latent representation.

Image reconstruction example

The spiking model performs the image reconstruction task well under the influence of noisy inputs.

Image reconstruction example

Features

  • CPU/GPU support

  • TensorBoard real-time monitoring

  • Weights and Biases logging

  • Custom loss functions

  • Custom metrics

  • Best and last model weights automatically saved

  • Pretrained weights available

  • Reconstruction & representation plotting

  • Dataset preprocessing options

  • Fully commented and documented

  • MNIST dataset

  • Fashion-MNIST dataset

  • CelebA dataset

  • Bouncing balls dataset

  • Moving MNIST dataset

  • Image-to-spike encoding (rate and time code)

  • Fully parameterized model architecture

  • Fully connected classifier

  • Convolutional classifier

  • Fully connected spiking classifier

  • Spiking convolutional classifier

  • Fully connected autoencoder

  • Convolutional autoencoder

  • Fully connected variational autoencoder

  • Convolutional variational autoencoder

  • Fully connected spiking autoencoder

  • Convolutional spiking autoencoder
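
The feature list above mentions image-to-spike encoding with a rate code and a time code. As a minimal, self-contained illustration of these two schemes, here is a sketch in plain Python; the function names and details are illustrative only and are not taken from this repository, which implements its own encoders on PyTorch tensors.

```python
import random

def rate_encode(pixel, n_steps, max_rate=1.0, rng=None):
    """Rate code: each time step spikes with probability proportional
    to the pixel intensity (assumed normalized to [0, 1])."""
    rng = rng or random.Random(0)
    p = max(0.0, min(1.0, pixel)) * max_rate
    return [1 if rng.random() < p else 0 for _ in range(n_steps)]

def time_encode(pixel, n_steps):
    """Time code (time-to-first-spike): brighter pixels fire earlier,
    and each input emits at most one spike."""
    train = [0] * n_steps
    if pixel > 0:
        t = min(n_steps - 1, int(round((1.0 - pixel) * (n_steps - 1))))
        train[t] = 1
    return train
```

For example, `time_encode(1.0, 10)` places its single spike at the first time step, while a dark pixel (`0.0`) produces no spike at all; `rate_encode` instead spreads spikes over the whole window at a rate matching the intensity.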

Results

Development and testing were done on Ubuntu 18.04 with 16 GB RAM, an AMD Ryzen 5 3600, and an NVIDIA RTX 2070.

Usage

Setup

Requires Python 3.7

The following lines will clone the repository and install all the required dependencies.

$ git clone https://github.com/jhuebotter/SpikingVAE.git
$ cd SpikingVAE
$ pip install -r requirements.txt

This project uses Weights and Biases for logging. In order to use this package, an account with their platform is required. Before running the scripts, you need to log in from your local machine by running

$ wandb login

For more information please see the official documentation.

Datasets

In order to download the datasets used in the thesis experiments, use

$ python setup.py

with the options mnist and fashion. For example, if you want to replicate all experiments in this thesis, run the following line:

$ python setup.py mnist fashion

It will download and store the datasets locally in the data folder.

Pretrained Models

Pretrained weights are not yet available here.

Train Models

$ cd src
$ python [model] [args] 

For example

$ python train_cnn.py --dataset mnist --epochs 10 --report-interval 1 --lr 0.001 
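
As a hypothetical sketch of how such command-line flags are typically declared with Python's `argparse` module (the actual `train_cnn.py` defines its own option set, which may differ):

```python
import argparse

def build_parser():
    """Illustrative parser mirroring the flags shown in the example above."""
    p = argparse.ArgumentParser(description="Train a model on an image dataset.")
    p.add_argument("--dataset", choices=["mnist", "fashion"], default="mnist",
                   help="which dataset to train on")
    p.add_argument("--epochs", type=int, default=10,
                   help="number of training epochs")
    p.add_argument("--report-interval", type=int, default=1,
                   help="log metrics every N epochs")
    p.add_argument("--lr", type=float, default=0.001,
                   help="optimizer learning rate")
    return p

# Parse the example invocation from above.
args = build_parser().parse_args(
    ["--dataset", "mnist", "--epochs", "10", "--lr", "0.001"]
)
```

Note that argparse converts `--report-interval` to the attribute `args.report_interval`, and unspecified flags fall back to their defaults.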

License

MIT License