by Roman Pogodin, Dane Corneil, Alexander Seeholzer, Joseph Heng, Wulfram Gerstner
Work accepted to the NeurIPS 2019 workshop "Real Neurons and Hidden Units: Future directions at the intersection of neuroscience and artificial intelligence".
We experiment with the stability and efficiency of local learning rules in the reservoir computing setting, where the reservoir is provided with an additional stabilizing input.
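As a rough illustration of this setting (a self-contained sketch under our own assumptions, not the repository's code): a rate-based reservoir drives a linear readout trained with a simple local delta rule, while an extra stabilizing signal is injected into the reservoir. All names and parameters below are illustrative.

```python
import numpy as np

# Sketch only: rate-based reservoir, local delta rule on the readout,
# plus a stabilizing input (here simply the target itself as a stand-in).

rng = np.random.default_rng(0)
N, dt, tau = 500, 1e-3, 10e-3
g = 1.5                                    # recurrent gain (chaotic for g > 1)
W_rec = g * rng.standard_normal((N, N)) / np.sqrt(N)
w_out = np.zeros(N)                        # linear readout, trained online
w_fb = rng.uniform(-1, 1, N)               # feedback of the readout into the reservoir
w_stab = rng.uniform(-1, 1, N)             # weights of the stabilizing input

eta = 1e-3                                 # learning rate of the local rule
T = 5000
target = np.sin(2 * np.pi * 5 * dt * np.arange(T))  # toy target signal
u_stab = target                            # stand-in stabilizing input

x = 0.1 * rng.standard_normal(N)           # reservoir state
for t in range(T):
    r = np.tanh(x)                         # firing rates
    z = w_out @ r                          # readout
    # local delta rule: each weight sees only its presynaptic rate
    # and a global error signal (no matrix inversion as in FORCE/RLS)
    w_out += eta * (target[t] - z) * r
    # leaky reservoir dynamics with readout feedback and stabilizing input
    x += dt / tau * (-x + W_rec @ r + w_fb * z + w_stab * u_stab[t])
```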
The code is organised as follows:

experiments.py -- experiments with both networks
plotting.py -- plotting tools for the experiments
simulation_interface.py -- interaction with both networks
reservoir_net.py -- a reservoir of neurons
attractor_net.py -- a rate network with attractor dynamics
utils.py -- a set of functions not directly related to the experiments
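For the attractor component, the following is a minimal, self-contained example of rate dynamics with fixed-point attractors, in the spirit of what attractor_net.py provides; the repository's actual implementation may differ, and every name and parameter here is an assumption.

```python
import numpy as np

# Illustrative only: a tiny rate network that relaxes to one of a few
# stored fixed-point attractors (Hopfield-style Hebbian weights).

rng = np.random.default_rng(1)
N, P, g = 200, 3, 2.0
patterns = rng.choice([-1.0, 1.0], size=(P, N))   # stored patterns
W = g * patterns.T @ patterns / N                 # Hebbian weights with gain g
np.fill_diagonal(W, 0.0)

dt, tau = 1e-3, 20e-3
x = patterns[0] + 0.8 * rng.standard_normal(N)    # noisy cue of pattern 0

for _ in range(2000):
    r = np.tanh(x)                                # firing rates
    x += dt / tau * (-x + W @ r)                  # relax towards an attractor

# the final state should overlap strongly with the cued pattern only
print(np.round(patterns @ np.tanh(x) / N, 2))
```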