Relja/netvlad

VLAD Parameters

Closed this issue · 7 comments

Hi,
I am trying to look at the VLAD parameters (w, b, and c, i.e. the cluster
centroids, etc.) as in the NetVLAD paper.

I downloaded the best model:

load('vd16_pitts30k_conv5_3_vlad_preL2_intra_white.mat')

display( sprintf( '# layers = %d', size(net.layers,2) ) )

for i=1:size(net.layers,2)
    mat = cell2mat(net.layers(i));
    
    if isfield( mat, 'name' )
        ty = getfield( mat, 'type' ) ;   
        name = getfield( mat, 'name' ) ;
        display( sprintf('%d: %s : %s', i, ty, name ) );
    end
end

Running this script gave:

# layers = 35
1: conv : conv1_1
2: relu : relu1_1
3: conv : conv1_2
4: pool : pool1
5: relu : relu1_2
6: conv : conv2_1
7: relu : relu2_1
8: conv : conv2_2
9: pool : pool2
10: relu : relu2_2
11: conv : conv3_1
12: relu : relu3_1
13: conv : conv3_2
14: relu : relu3_2
15: conv : conv3_3
16: pool : pool3
17: relu : relu3_3
18: conv : conv4_1
19: relu : relu4_1
20: conv : conv4_2
21: relu : relu4_2
22: conv : conv4_3
23: pool : pool4
24: relu : relu4_3
25: conv : conv5_1
26: relu : relu5_1
27: conv : conv5_2
28: relu : relu5_2
29: conv : conv5_3
30: normalize : preL2
32: normalize : vlad:intranorm
34: conv : WPCA

So, I tried to look at layers 30 and 32, but I fail to see the learned weights.
I do see the weights for other layers, though.

>>  cell2mat( net.layers(30) )

  struct with fields:

        type: 'normalize'
        name: 'preL2'
       param: [1024 1.0000e-12 1 0.5000]
    precious: 0

>>  cell2mat( net.layers(32) )

  struct with fields:

        type: 'normalize'
        name: 'vlad:intranorm'
       param: [1024 1.0000e-12 1 0.5000]
    precious: 0

Am I missing something?

Relja commented

Yes - NetVLAD is not a standard layer, so it is not included in MatConvNet by default. It is a custom layer implemented as a class, which means you are missing one of these two things (or both):

  1. You need to download the NetVLAD code and make sure it is on the Matlab path. Otherwise Matlab won't know how to load the objects from the mat file and will leave the custom layers empty.

  2. Because these layers are implemented as classes, they are objects, not structs, so your code won't work - I'm not sure what cell2mat will do, and isfield only works for structs and returns false otherwise (https://www.mathworks.com/help/matlab/ref/isfield.html); use isprop for objects (https://www.mathworks.com/help/matlab/ref/isprop.html). As far as I know all MatConvNet layers have .name, so you don't need to check for that; you probably only checked because you found some empty layers due to (1).
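Both points together might look something like the following sketch (this assumes the NetVLAD code has already been added to the path, so the custom layers load as objects rather than empty entries; the isstruct/isobject split is just one way to handle the mix of layer kinds):

% Sketch: list layer types and names, handling both plain struct layers
% and the custom NetVLAD class-based layers.
load('vd16_pitts30k_conv5_3_vlad_preL2_intra_white.mat');
for i = 1:numel(net.layers)
    l = net.layers{i};              % cell indexing works for structs and objects
    if isstruct(l)
        fprintf('%d: %s : %s\n', i, l.type, l.name);
    elseif isobject(l) && isprop(l, 'name')
        % custom layers: report the class name instead of a .type field
        fprintf('%d: %s : %s\n', i, class(l), l.name);
    end
end

With the NetVLAD code on the path, layers 31 and 33 should then show up in the listing instead of being skipped.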

So, where are the weights for the NetVLAD layer stored? I wish to access those.

Relja commented

If you download the NetVLAD code and make sure it is on the path, loading the network from the mat file will work instead of producing some empty layers. Then you can examine the loaded NetVLAD layer and access its weights, e.g. net.layers{31}.weights
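For example, a minimal inspection sketch (assuming layer 31 is the vlad:core layer and that, like standard MatConvNet layers, it stores its parameters in a .weights cell array - both assumptions, not guaranteed by this thread):

% Sketch: inspect the custom NetVLAD layer after loading with the
% NetVLAD code on the path.
vladLayer = net.layers{31};
disp(class(vladLayer));                    % the custom layer class
for k = 1:numel(vladLayer.weights)         % assumed .weights cell array
    fprintf('weight %d: %s\n', k, mat2str(size(vladLayer.weights{k})));
end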

I ran your demo code, computeRepresentation.m, and attempted to look at the weights.
Under vlad:core I see only two weight arrays, both of dimensions (1, 1, 512, 64). But I was expecting to see three sets of weights for the NetVLAD layer, as mentioned in your NetVLAD paper.

Am I correct in assuming that the bias is not trained in this case?

Relja commented

Yes, the layerVLAD.m code is without bias, while layerVLADv2.m is with bias (we mentioned in the arXiv appendix that we fix the bias, but it seems we accidentally dropped that from the v3 version of the paper). For the setting in the paper there is not much difference between the two, and the reason is that the input features are L2-normalized, in which case you can do the assignment with a simple scalar product and don't need a bias (e.g. take the assignment equation 2, expand it and assume |x_i|=1 and |c_k|=1; then you don't get any bias terms, as they all cancel out). If the features were not L2-normalized, the two would not be similar and layerVLADv2 should probably be used (but this is just theory - I don't know if it is needed in practice).
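Spelling out the cancellation (if I read the paper's soft-assignment, equation 2, correctly):

    a_k(x_i) = exp(-alpha ||x_i - c_k||^2) / sum_k' exp(-alpha ||x_i - c_k'||^2)

Expanding ||x_i - c_k||^2 = ||x_i||^2 - 2 x_i^T c_k + ||c_k||^2, the factor exp(-alpha ||x_i||^2) is common to numerator and denominator and cancels, leaving

    a_k(x_i) = exp(w_k^T x_i + b_k) / sum_k' exp(w_k'^T x_i + b_k'),
    with  w_k = 2 alpha c_k  and  b_k = -alpha ||c_k||^2.

If additionally ||c_k|| = 1 for every k, then b_k = -alpha is the same constant for all clusters and cancels in the softmax as well, so the assignment reduces to a bias-free scalar product with the cluster centroids.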

Ok, I get your point. How about batch normalization? Is this network fine-tuned from the ImageNet VGG without batch-norm updates?

Relja commented

As you can see from your list of layers, there are no batch-norm layers in the original VGG network.