junku901/machine_learning

How to set a suitable 'hidden_layer_sizes'?


My data has only one output. When I set the output size to one, I get this error:

throw new Error('Matrix mismatch.');

Is this an issue with the hidden layers, and how do I solve it?

Sorry for the really, really late reply. All of the data handled by this library are matrices (2-dimensional arrays), so the label must be a matrix even when the number of outputs is one (e.g. [[1],[0]]).
The variable 'hidden_layer_sizes' is the number of hidden units in each hidden layer (e.g. [3,4,2] indicates an MLP with the structure 'input' - '3-unit hidden layer' - '4-unit hidden layer' - '2-unit hidden layer' - 'output').
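
Putting both points together, here is a minimal sketch of a single-output setup. The data values, learning rate, epoch count, and hidden layer widths below are made up for illustration, and the fit/predict calls follow the pattern in the library's README (check them against your installed version):

var ml = require('machine_learning');

// Three input features per sample -> n_in : 3
var x = [[0.2, 0.3, 0.1],
         [0.9, 0.8, 0.7],
         [0.1, 0.5, 0.4]];
// Labels stay a matrix (one column) even though there is a single output -> n_out : 1
var y = [[0],
         [1],
         [0]];

var mlp = new ml.MLP({
    'input' : x,
    'label' : y,
    'n_in' : 3,                    // number of input features
    'n_out' : 1,                   // number of output units (columns of y)
    'hidden_layer_sizes' : [4, 3]  // hidden layers only: input - 4 units - 3 units - output
});

mlp.fit({ 'lr' : 0.6, 'epochs' : 2000 });
console.log(mlp.predict([[0.3, 0.2, 0.2]]));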

Hello,
given

var x = [[0.255,0.255,0.255],
         [0,0,0],
         [0.250,0.132,0.14],
         [0.20,0.20,0],
         [0.40,0.40,0.10],
         [0.100,0.100,0.100]];
var y = [[1],
         [0],
         [0.65],
         [0.7],
         [0.14],
         [0.39]];

var mlp = new ml.MLP({
    'input' : x,
    'label' : y,
    'n_in' : 3,
    'n_out' : 1,
    'hidden_layer_sizes' : [3,1]
});

which would be the right hidden_layer_sizes, and why?

Thanks

I am having the same issue. I'm getting a matrix mismatch error. I'm not sure how to choose the hidden layer sizes. I have 10 different independent variables with varying numbers of data points. My Y is [1,0] or [0,1]. So do I select a hidden layer size of [10,2], or do I need to do something like [10, {max number of data points}, 2]? Right now this is blocking me from using this library.
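
For what it's worth, based on the explanation above, 'hidden_layer_sizes' lists only the hidden layers; the input and output widths come from 'n_in' and 'n_out', so neither 10 nor 2 needs to appear in it. A minimal sketch for that shape of data (the feature values and the [8, 4] hidden widths here are made up for illustration):

var ml = require('machine_learning');

// 10 independent variables per sample -> n_in : 10
var x = [[0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0],
         [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1]];
// One-hot labels [1,0] / [0,1] -> n_out : 2
var y = [[1, 0],
         [0, 1]];

var mlp = new ml.MLP({
    'input' : x,
    'label' : y,
    'n_in' : 10,                   // input features
    'n_out' : 2,                   // output classes
    'hidden_layer_sizes' : [8, 4]  // hidden layers only, not the input/output sizes
});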