genn-team/genn

Issues with merging synapse groups where pre or postsynaptic neuron parameters are referenced

Closed this issue · 0 comments

When merging synapse groups, the only requirement placed on any pre- or postsynaptic neuron parameters (or variables) referenced with the $(X_pre) or $(Y_post) syntax is that the weight update model code is identical. This is fine for variables, as all that happens is something like:

const auto preVars = getArchetype().getSrcNeuronGroup()->getNeuronModel()->getVars();
for(const auto &v : preVars) {
    // If variable is referenced in code string, add source pointer
    if(code.find("$(" + v.name + "_pre)") != std::string::npos) {
        addSrcPointerField(v.type, v.name + "Pre", backend.getDeviceVarPrefix() + v.name);
    }
}

However, with parameters, the code uses the index of a parameter rather than its name to e.g. check whether values are the same across all groups being merged (where the index is that of a parameter in the 'archetype' group, i.e. the first group in the merged group). This is totally flawed:

template<typename P>
bool isParamValueHeterogeneous(size_t index, P getParamValuesFn) const
{
    // Get value of parameter in archetype group
    const double archetypeValue = getParamValuesFn(getArchetype()).at(index);

    // Return true if any parameter values differ from the archetype value
    return std::any_of(getGroups().cbegin(), getGroups().cend(),
                       [archetypeValue, index, getParamValuesFn](const GroupInternal &g)
                       {
                           return (getParamValuesFn(g).at(index) != archetypeValue);
                       });
}

There's nothing to prevent the target neuron groups in a merged synapse group from having totally different numbers or orderings of parameters. This can result in std::out_of_range exceptions being thrown (at least the code uses std::vector::at...) or, worse, the wrong parameter value being used silently.

Obviously several fixes are possible but, as this will all be fixed in GeNN 5.0.0 by #493, which replaces the vectors holding parameters with maps and all index-based lookups with string-based lookups, I don't think it's worth the effort. The workaround is clear: just don't use the $(X_pre) or $(Y_post) syntax to reference neuron parameters from weight update models; duplicate the parameters in the weight update model instead.
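For illustration, the direction of that fix can be sketched as a name-keyed heterogeneity check. This is a hypothetical simplification, not the actual #493 code: with parameters held in a name-to-value map, a mismatched group causes an explicit lookup failure rather than a silent positional mix-up:

```cpp
#include <algorithm>
#include <map>
#include <string>
#include <vector>

// Hypothetical group holding parameters keyed by name rather than index
struct Group {
    std::map<std::string, double> params;
};

// Return true if any group's value for the named parameter differs from
// the archetype's (the first group's); map::at throws if a group lacks
// the parameter, instead of reading an unrelated value
bool isParamValueHeterogeneous(const std::string &name,
                               const std::vector<Group> &groups) {
    const double archetypeValue = groups.front().params.at(name);
    return std::any_of(groups.cbegin(), groups.cend(),
                       [&](const Group &g) {
                           return g.params.at(name) != archetypeValue;
                       });
}
```

Because lookup is by name, the declaration order of parameters in each neuron model no longer matters.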