In this project, we develop three models to predict the membrane potential of giant squid axons over time: a Wiener filter, a reservoir network, and an LSTM. We demonstrate that, although simple linear methods (e.g., a Wiener filter) are insufficient to predict chaotic neural data, a suitably tuned reservoir computer (a network designed to predict the evolution of chaotic systems) and an LSTM both yield more promising results. Moreover, the reservoir computer requires significantly less training time than the LSTM, suggesting that reservoir networks are best suited to predicting chaotic data.
This repository contains all implementations for our project: a Wiener filter, a reservoir computing network that we built from scratch, a reservoir computing network built using ReservoirPy's framework, and an LSTM built using PyTorch. We also include scripts that perform the hyperparameter tuning presented in our project, sample hyperparameter trials, and sample predictions on four axon trials from the Squid Giant Axon Membrane Potential (SGAMP) database. Please contact Jake Hofgard (whofgard@stanford.edu), Kai Fronsdal (kaif@stanford.edu), and Shaunak Bhandarkar (shaunakb@stanford.edu) with any questions.
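For orientation, the sketch below shows how a ReservoirPy echo state network can be assembled for one-step-ahead prediction of a membrane-potential trace. It is a minimal illustration of the framework used here, not the project's actual configuration: the hyperparameter values (number of units, spectral radius, leak rate, ridge penalty) and the array names are placeholders, and the tuned values live in the hyperparameter tuning scripts in this repository.

```python
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge

# Hypothetical membrane-potential trace of shape (timesteps, 1);
# in practice this would come from an SGAMP axon trial.
signal = np.sin(np.linspace(0, 100, 2000)).reshape(-1, 1)

# One-step-ahead prediction: input is the signal, target is the signal shifted by one step.
X, y = signal[:-1], signal[1:]
split = int(0.8 * len(X))
X_train, y_train, X_test = X[:split], y[:split], X[split:]

# Placeholder hyperparameters -- the tuned values are found by the tuning scripts.
reservoir = Reservoir(units=500, sr=0.9, lr=0.3)
readout = Ridge(ridge=1e-6)

# Connect the reservoir to a linear ridge readout and train the readout.
esn = reservoir >> readout
esn = esn.fit(X_train, y_train)

# Predict the held-out portion of the trace.
y_pred = esn.run(X_test)
```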