It's time to bring deep learning and neuroscience together. With this library, we offer machine learning tools to neuroscientists, and neuroscience tools to computer scientists. These two domains were created to be one!
NeuroTorch was developed to be easy to use, so you can do simple things in a few lines of code. Moreover, NeuroTorch is modular, so you can adapt it to your needs relatively quickly. Thanks and stay tuned, because more is coming!
What can be done with NeuroTorch in the current version?
- Image classification with spiking networks.
- Classification of spiking time series with spiking networks.
- Time series classification with spiking networks or Wilson-Cowan dynamics.
- Reconstruction/prediction of time series with Wilson-Cowan dynamics.
- Reconstruction/prediction of continuous time series with spiking networks.
- Backpropagation Through Time (BPTT).
- Truncated-Backpropagation-Through-Time (TBPTT).
- Learning Algorithm: Eligibility-Propagation.
- Anything you are able to do using the modules already created.
- Reinforcement Learning.
- Learning Algorithm: RLS (Recursive Least Squares).
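Several of the features above build on the Wilson-Cowan population model. As intuition for those dynamics, here is a minimal plain-Python sketch of the classic two-population rate equations, independent of NeuroTorch's own implementation; the coupling weights and drives below are illustrative values only:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def wilson_cowan_step(E, I, dt=0.1,
                      w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0,
                      P=1.25, Q=0.0):
    """One Euler step of the two-population Wilson-Cowan rate equations."""
    dE = -E + sigmoid(w_ee * E - w_ei * I + P)  # excitatory population
    dI = -I + sigmoid(w_ie * E - w_ii * I + Q)  # inhibitory population
    return E + dt * dE, I + dt * dI

# Integrate from a small initial activity; rates stay bounded in (0, 1).
E, I = 0.1, 0.1
for _ in range(1000):
    E, I = wilson_cowan_step(E, I)
```

NeuroTorch wraps dynamics like these in trainable layers, so the connectivity parameters can be learned from recorded activity instead of being fixed by hand.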
| Method | Commands |
|---|---|
| PyPi | `pip install neurotorch` |
| source | `pip install git+https://github.com/NeuroTorch/NeuroTorch` |
| wheel | 1. Download the `.whl` file here; 2. copy the path of this file on your computer; 3. `pip install [path].whl` |
To install the latest unstable version, download the latest `.whl` file and follow the instructions above.
See the readme of the tutorials folder here.
| Tutorial | Project | Description |
|---|---|---|
| Jupyter Notebook | Repository | Image classification with spiking networks (MNIST/Fashion-MNIST). |
| Jupyter Notebook | Repository | Time series classification with spiking networks (Heidelberg). |
| Jupyter Notebook | Repository | Time series forecasting with spiking networks (neuronal activity). Work in progress; not published yet. |
| Jupyter Notebook | Null | Time series forecasting with Wilson-Cowan (neuronal activity). |
```python
import neurotorch as nt
import torch
import pprint

n_hidden_neurons = 128
checkpoint_folder = "./checkpoints/checkpoint_000"
checkpoint_manager = nt.CheckpointManager(checkpoint_folder)

# `get_dataloaders` is a user-defined helper returning a dict with
# "train", "val" and "test" torch DataLoaders.
dataloaders = get_dataloaders(
    batch_size=256,
    train_val_split_ratio=0.95,
)

# Two-layer spiking network: a recurrent LIF layer followed by a
# spiking leaky-integrator readout.
network = nt.SequentialRNN(
    layers=[
        nt.LIFLayer(
            input_size=nt.Size(
                [
                    nt.Dimension(None, nt.DimensionProperty.TIME),
                    nt.Dimension(dataloaders["test"].dataset.n_units, nt.DimensionProperty.NONE),
                ]
            ),
            output_size=n_hidden_neurons,
            use_recurrent_connection=True,
        ),
        nt.SpyLILayer(output_size=dataloaders["test"].dataset.n_classes),
    ],
    name="Network",
    checkpoint_folder=checkpoint_folder,
).build()

# Train with Backpropagation Through Time (BPTT).
learning_algorithm = nt.BPTT(optimizer=torch.optim.Adam(network.parameters(), lr=1e-3))
trainer = nt.ClassificationTrainer(
    model=network,
    callbacks=[checkpoint_manager, learning_algorithm],
    verbose=True,
)
training_history = trainer.train(
    dataloaders["train"],
    dataloaders["val"],
    n_iterations=100,
    load_checkpoint_mode=nt.LoadCheckpointMode.LAST_ITR,
)
training_history.plot(show=True)

# Reload the best checkpoint and evaluate the accuracy on every split.
network.load_checkpoint(checkpoint_manager.checkpoints_meta_path, nt.LoadCheckpointMode.BEST_ITR, verbose=True)
predictions = {
    k: nt.metrics.ClassificationMetrics.compute_y_true_y_pred(
        network, dataloader, verbose=True, desc=f"{k} predictions"
    )
    for k, dataloader in dataloaders.items()
}
accuracies = {
    k: nt.metrics.ClassificationMetrics.accuracy(network, y_true=y_true, y_pred=y_pred)
    for k, (y_true, y_pred) in predictions.items()
}
pprint.pprint(accuracies)
```
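To build intuition for what a LIF layer computes at each time step, here is a rough standalone leaky integrate-and-fire simulation of a single neuron. This is a pedagogical sketch, not NeuroTorch's exact implementation; the parameter names, hard-reset rule, and values are illustrative:

```python
import math

def lif_simulate(input_current, v_th=1.0, tau=20.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron; returns spike time steps."""
    alpha = math.exp(-dt / tau)  # membrane leak factor per step
    v, spikes = 0.0, []
    for t, i_t in enumerate(input_current):
        v = alpha * v + i_t      # leak the membrane potential, then integrate input
        if v >= v_th:            # threshold crossing emits a spike
            spikes.append(t)
            v = 0.0              # hard reset after the spike
    return spikes

# A constant input drives the neuron to fire at a regular rate.
print(lif_simulate([0.3] * 20))  # → [3, 7, 11, 15, 19]
```

In a trained network, the input current of each neuron is a learned weighted sum of upstream spikes, and the non-differentiable threshold is handled with surrogate gradients during BPTT.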
On the one hand, neuroscientists are increasingly using machine learning (ML) without necessarily having the expertise to create training pipelines. On the other hand, most ML experts lack the neuroscience background to implement biologically inspired models. There is thus a need for a tool providing a complete ML pipeline with features originating from neuroscience while using a simple and intuitive interface.
The goal of this work is to provide a Python package, NeuroTorch, offering a flexible and intuitive training pipeline together with biologically constrained neuronal dynamics. The package includes several learning strategies widely used in both ML and neuroscience, so that both fields can benefit from it.
- Norse is a highly optimized spiking neural network library for PyTorch. At first glance, it seems very similar to NeuroTorch. The main difference is that NeuroTorch focuses on learning algorithms for spiking neural networks and other bio-inspired dynamics such as Wilson-Cowan, while Norse focuses on the spiking neural network layers themselves. In addition, NeuroTorch will soon make it easy to use modules from Norse.
- SpyTorch presents a set of tutorials for training SNNs with the surrogate gradient approach SuperSpike by F. Zenke and S. Ganguli (2017). In fact, the prefix 'Spy' of certain layers in NeuroTorch is a reference to SpyTorch.
- PySNN is a PyTorch extension similar to Norse.
- Pytorch Lightning is a deep learning framework to train, deploy, and ship AI products Lightning fast.
- Poutyne is a simplified framework for PyTorch that handles much of the boilerplate code needed to train classical neural networks.
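The surrogate-gradient idea behind SpyTorch and NeuroTorch's spiking layers fits in a few lines: the forward pass uses the non-differentiable spike (Heaviside) function, while the backward pass substitutes a smooth pseudo-derivative. A minimal sketch of SuperSpike's fast-sigmoid surrogate, where the steepness β is an illustrative value:

```python
def spike(v):
    # Forward pass: Heaviside step on the membrane potential v.
    # Its true derivative is zero almost everywhere, so plain backprop fails.
    return 1.0 if v >= 0.0 else 0.0

def superspike_grad(v, beta=10.0):
    # Backward pass: SuperSpike's smooth pseudo-derivative (Zenke & Ganguli),
    # 1 / (1 + beta * |v|)^2, which peaks at the threshold and decays away from it.
    return 1.0 / (1.0 + beta * abs(v)) ** 2
```

In an autograd framework, these two pieces are combined into a custom function whose forward is `spike` and whose backward scales the incoming gradient by `superspike_grad`.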
This package is part of a postgraduate research project carried out by Jérémie Gince and supervised by Simon V Hardy and Patrick Desrosiers. Our work was supported by: (1) UNIQUE, a FRQNT-funded research center, (2) the Sentinelle Nord program of Université Laval, funded by the Canada First Research Excellence Fund, and (3) NSERC.
- Documentation at https://NeuroTorch.github.io/NeuroTorch/.
- Github at https://github.com/NeuroTorch/NeuroTorch/.
- Anthony Drouin who helped develop the Wilson-Cowan application during his 2022 summer internship and who is now a collaborator of the project.
- Antoine Légaré and Thomas Charland who made the awesome logo of NeuroTorch.
- To my dog Chewy who has been a great help during the whole development.
@misc{Gince2022,
title={NeuroTorch: A Python library for machine learning and neuroscience.},
author={Jérémie Gince},
year={2022},
publisher={Université Laval},
url={https://github.com/NeuroTorch},
}