dnouri/cuda-convnet

Error when I tried this code on Linux...

Closed this issue · 7 comments

When I replaced the code in layer.py with the following...

    dic, name = self.dic, self.dic['name']
    dic['dropout'] = 0.0
    # read the per-layer dropout rate from the layer-params config, if present
    if name in mcp.sections():
        dic['dropout'] = mcp.safe_get_float(name, 'dropout', default=0.0)
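
My understanding of what this snippet is meant to do, written out against a plain ConfigParser (just a sketch: the 'fc1024' section name and the layer-params.cfg path are placeholders, and I'm assuming safe_get_float simply falls back to its default when the option is missing):

    from ConfigParser import SafeConfigParser  # Python 2, as cuda-convnet uses

    mcp = SafeConfigParser()
    mcp.read('layer-params.cfg')       # per-layer options, one section per layer

    name = 'fc1024'                    # the layer's name doubles as its section name
    dropout = 0.0
    if name in mcp.sections():         # only layers listed in the params file
        if mcp.has_option(name, 'dropout'):
            dropout = mcp.getfloat(name, 'dropout')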

When I ran it, it showed an error message saying that 'name' is not defined in dic.
I had changed the code in layer.cu, layer.cuh, convert.cu, and layer.py after comparing the files with the original cuda-convnet code.

How can I solve this issue? Please give me some tips.

Sorry, no idea what you're trying to do. You replaced the code? How and why?

I work with CUDA 4.0 and cuda-convnet, so your modified cuda-convnet project can't be built on my machine.
It seems your project is based on CUDA 6.0, which is why I just tried to copy the dropout-related code into my own code. Unfortunately I haven't been able to upgrade from CUDA 4.0 to 6.0 yet.

In fact, there is no 'name' key in self.dic when I hit the error.
Where is the 'name' key supposed to be set?

Is there any solution for my case? Please give me a tip.
Thanks.

Aha! Well then try to use an older version from before the project was upgraded to use CUDA 5: d97cf37
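
(For reference, checking that out from a clone of the repository should just be `git checkout d97cf37`.)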

Oh, I solved it with your hints, thanks.
I have one more question about where dropout should go in the layer params.

In fact, I use conv, pool, and fc layers in my net.
For which layers is it effective to put dropout in the layer params?
Only for the fully connected layers? I actually use 3 fc layers.
Please give me a hint on this.
Thanks.

The README links to two papers which I suggest you take a look at. And yes, people will usually apply dropout to fully connected layers. A good method seems to be to make your net learn well and overfit first, and then apply dropout.
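
In case a concrete example helps, here is a minimal NumPy sketch of dropout on a fully connected layer's activations (this is not the project's CUDA code, and the 0.5 rate and array shapes are just examples). During training each unit is zeroed independently with probability p; at test time all units are kept and the activations are scaled by (1 - p), as described in the papers the README links to:

    import numpy as np

    def dropout_train(acts, p=0.5, rng=np.random):
        # zero each unit independently with probability p
        mask = rng.rand(*acts.shape) >= p
        return acts * mask

    def dropout_test(acts, p=0.5):
        # keep every unit, but scale so expected activity matches training
        return acts * (1.0 - p)

    acts = np.random.randn(128, 1024)  # e.g. one minibatch of fc-layer outputs
    dropped = dropout_train(acts, p=0.5)
    scaled = dropout_test(acts, p=0.5)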

Thanks Daniel.

@teddykspark, can you give me some details of how you dealt with CUDA 4.0? My CUDA version is 4.2. Thanks!!