catniplab/vlgp

Rates fall back to zero when a spike occurs

Closed this issue · 1 comment

Every time there is a spike (sample('y')) in the tutorial, the rate (sample('rate')) does not continue normally but drops back to zero.

Shouldn't the rate be unaffected by the individual spikes, since they are sampled from the rates?

The problem probably lies in line 111 of simulation.py,

```python
eta = x[m, t, :] @ a + np.einsum('ij, ji -> i', h[:, m, t, :], b)
```

which I do not fully understand.

@zuluca We add suppressive history filters in the simulation to implement the refractory period of neurons. If a neuron fires a spike, it will not fire again for a few bins.
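This suppressive effect can be illustrated with a small self-contained sketch (the filter values, base rate, and bin count below are hypothetical choices for illustration, not taken from simulation.py): a strongly negative history filter makes the rate collapse for a few bins after each spike, which is exactly the dip seen in the tutorial.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bins, n_lags = 200, 5

# Hypothetical suppressive history filter: strongly negative right after a spike,
# decaying back toward zero over a few bins (the refractory period).
b = np.array([-5.0, -3.0, -2.0, -1.0, -0.5])

base_log_rate = np.log(0.3)   # assumed base firing rate per bin
spikes = np.zeros(n_bins)
rates = np.zeros(n_bins)
history = np.zeros(n_lags)    # most recent bin first

for t in range(n_bins):
    # Log-linear model: log rate = bias + (spike history) @ (history filter)
    eta = base_log_rate + history @ b
    rates[t] = np.exp(eta)
    # Bernoulli draw as a one-spike-per-bin approximation of a Poisson count
    spikes[t] = rng.random() < rates[t]
    history = np.roll(history, 1)
    history[0] = spikes[t]
```

Right after a spike, the history term contributes at least -5 to the log rate, so the rate falls by a factor of roughly e^-5 before recovering as the spike moves out of the history window.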

On line 111, eta is the vector of log firing rates of all neurons at the t-th bin of the m-th trial. It is just a log-linear model with latent factors (left summand) and covariates (right summand). The covariates may contain a bias term (the base log firing rate), spike history, and external variables. We use a tensor and an Einstein summation here for compactness, since each neuron has its own distinct spike history. It could be simplified to a vector if you only need the bias term.
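The einsum can be unpacked with a minimal sketch (the array shapes here are assumed for illustration): `'ij, ji -> i'` pairs each neuron's own history with its own filter column, so neuron i gets the dot product of row i of the history slice with column i of the filter matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_latents, n_lags = 3, 2, 5

# Assumed shapes, following the quoted line from simulation.py:
x_t = rng.standard_normal(n_latents)                 # latent state at one bin
a = rng.standard_normal((n_latents, n_neurons))      # loading matrix
h_t = rng.standard_normal((n_neurons, n_lags))       # per-neuron spike history
b = -np.abs(rng.standard_normal((n_lags, n_neurons)))  # suppressive filters

# Each neuron's log rate: latent term plus its OWN history dotted with its OWN filter
eta = x_t @ a + np.einsum('ij, ji -> i', h_t, b)

# Equivalent explicit loop, to show what the einsum computes per neuron
eta_loop = x_t @ a + np.array([h_t[i] @ b[:, i] for i in range(n_neurons)])
assert np.allclose(eta, eta_loop)

rate = np.exp(eta)  # log-linear model: eta is the log firing rate
```

Note that a plain matrix product `h_t @ b` would mix every neuron's history into every neuron's rate; the einsum keeps only the diagonal of that product, one entry per neuron.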