lisa-lab/pylearn2

CompositeLayer fails if number of inputs is larger than number of layers


In my MLP model I have a composite space with two components as input:

input_space: !obj:pylearn2.space.CompositeSpace {
  components: [
    !obj:pylearn2.space.VectorSpace {
      dim: 50, 
    },
    !obj:pylearn2.space.IndexSpace {
      dim: 1, 
      max_labels: 100
    }
  ]
},
input_source: ['features', 'additional_stuff']

In my model, I want to discard the second component, so I use a CompositeLayer with only one layer, and specify that input 0 is routed to layer 0 and input 1 to nothing:

!obj:pylearn2.models.mlp.CompositeLayer {
  layer_name: 'composite_layer',
  inputs_to_layers: {0: [0], 1: []},
  layers: [
    !obj:pylearn2.models.mlp.RectifiedLinear {
      layer_name: 'h0',
      dim: 300,
      sparse_init: 15,
      max_col_norm: 5.0,
    },
  ],
},
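
For reference, the same layer can be built directly in Python. This is a minimal sketch, assuming the YAML keys above are passed straight through as constructor keyword arguments (which is how !obj: tags normally behave):

```python
# Minimal sketch: the CompositeLayer from the YAML above, built in Python.
# Keyword names mirror the YAML keys; everything else is an assumption.
from pylearn2.models.mlp import CompositeLayer, RectifiedLinear

composite_layer = CompositeLayer(
    layer_name='composite_layer',
    layers=[
        RectifiedLinear(layer_name='h0', dim=300,
                        sparse_init=15, max_col_norm=5.0),
    ],
    # Route input 0 to layer 0 and input 1 to nothing.
    inputs_to_layers={0: [0], 1: []},
)
```

Constructing the layer this way should hit the same assert described below, since (as noted later in this issue) the number of inputs is not yet known at that point.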

This causes the assert at https://github.com/lisa-lab/pylearn2/blob/master/pylearn2/models/mlp.py#L4071 to fail. The assertion is invalid: it checks whether the ID of an input is less than the number of layers.

I think the number of actual inputs is not known at that point, so I would recommend simply removing that line (this works according to my experiments).

I think a different check should happen in CompositeLayer.set_input_space(), where the model can actually confirm that every key is smaller than the number of components in the input space; a sketch of such a check follows.
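
As a rough illustration of that idea, here is a minimal sketch of the kind of check set_input_space() could run. The helper name is hypothetical and the surrounding set_input_space logic is not shown; only CompositeSpace.components and the Layer.set_input_space(space) signature are taken from pylearn2.

```python
from pylearn2.space import CompositeSpace


def check_inputs_to_layers(inputs_to_layers, space):
    """Hypothetical helper: verify that every key in inputs_to_layers
    refers to an existing component of the (possibly composite) input space."""
    n_inputs = len(space.components) if isinstance(space, CompositeSpace) else 1
    for key in inputs_to_layers:
        assert 0 <= key < n_inputs, (
            "inputs_to_layers refers to input %d, but the input space "
            "has only %d component(s)" % (key, n_inputs))
```

CompositeLayer.set_input_space() could call something like check_inputs_to_layers(self.inputs_to_layers, space) before doing its usual work, which would catch invalid keys at the point where the number of input components is actually known.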

I agree that it doesn't make sense to assert that the key is less than the number of layers. Your example is one instance of this, and one could also build a network where a bunch of inputs go into a nested CompositeLayer, so that again the number of keys could be larger than the number of layers.
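
To make the nested case concrete, here is a hedged sketch (the exact routing semantics of inputs_to_layers for nested layers are assumed, not taken from the library): three input components feed an outer CompositeLayer that contains a single nested CompositeLayer, so the mapping has more keys than the outer layer has layers.

```python
# Hypothetical nested example: more input components (3) than outer layers (1).
from pylearn2.models.mlp import CompositeLayer, RectifiedLinear

inner = CompositeLayer(
    layer_name='inner',
    layers=[
        RectifiedLinear(layer_name='h0', dim=300, sparse_init=15),
        RectifiedLinear(layer_name='h1', dim=300, sparse_init=15),
    ],
)

outer = CompositeLayer(
    layer_name='outer',
    layers=[inner],
    # Keys 0, 1, 2 refer to input components; the outer layer list has length 1,
    # so the original assert (key < number of layers) would reject keys 1 and 2.
    inputs_to_layers={0: [0], 1: [0], 2: []},
)
```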