
The repository for Feedback Networks, Zamir et al.


Paper: Feedback Networks, CVPR 2017.

Amir R. Zamir*, Te-Lin Wu*, Lin Sun, William B. Shen, Bertram E. Shi, Jitendra Malik, Silvio Savarese.

Feedback Networks training in Torch

============================

Requirements

Code adapted and modified from fb.resnet.torch. See the installation instructions for a step-by-step guide.

If you already have Torch installed, update nn, cunn, and cudnn.
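If Torch was installed via luarocks, one common way to update these packages (assuming a standard luarocks setup) is:

luarocks install nn
luarocks install cunn
luarocks install cudnn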

Training

The training scripts come with several options, which can be listed with the --help flag.

th main.lua --help

To run training, see the example run.sh; the options are explained below, followed by an illustrative invocation:

th main.lua -seqLength [number of feedback iterations] -sequenceOut [true for feedback, false for recurrence inference] -nGPU [number of GPUs]
-depth [20 to bypass] -batchSize [batch size] -dataset [cifar100] -nEpochs [number of epochs to train]
-netType [the model under the models/ directory] -save [checkpoints directory to save the model] -resume [checkpoints directory to restore the model]
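As an illustration only (the values below are placeholders, not the authors' exact settings), a full training invocation could look like:

th main.lua -seqLength 4 -sequenceOut true -nGPU 1 \
   -depth 20 -batchSize 64 -dataset cifar100 -nEpochs 200 \
   -netType feedback_48 -save checkpoints/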

Testing

To run testing, turn on the testOnly flag and point -resume at the directory where the checkpoints were saved, as follows:

-testOnly 'true' -resume [checkpoints directory to restore the model]
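For example, assuming the checkpoints were saved under a placeholder directory checkpoints/:

th main.lua -dataset cifar100 -testOnly 'true' -resume checkpoints/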

Using your own criterion

You can write your own criterion, store it under the lib/ directory, and require it in models/init.lua. Add another option in opts.lua to use it when running a script, for example:

-- in opts.lua: register the flag as a string option, defaulting to 'false'
cmd:option('-coarsefine', 'false', 'If using this criterion or not')
-- convert the string flag into a boolean
opt.coarsefine = opt.coarsefine ~= 'false'

In the bash script add

-coarsefine 'true'
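As a minimal sketch of how the flag could select the criterion in models/init.lua (the file lib/CoarseFineCriterion.lua and the class nn.CoarseFineCriterion are hypothetical placeholders for your own criterion, not part of the repo):

-- hypothetical: lib/CoarseFineCriterion.lua defines nn.CoarseFineCriterion via torch.class
require('lib/CoarseFineCriterion')

local criterion
if opt.coarsefine then
   -- custom criterion, enabled by passing -coarsefine 'true'
   criterion = nn.CoarseFineCriterion():cuda()
else
   -- default criterion in fb.resnet.torch-style training
   criterion = nn.CrossEntropyCriterion():cuda()
end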

Writing your own model

You can develop your own model and store it under models/; see our model models/feedback_48.lua as an example. Modify the code below the following lines within that code block, and set -netType in your bash script or command to the name of the model you develop:

elseif opt.dataset == 'cifar100' then
   -- Model type specifies number of layers for CIFAR-100 model
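For illustration, a hypothetical body for that branch is sketched below; the layers are placeholders, not the actual feedback_48 architecture, and only the 100-way output is fixed by CIFAR-100:

elseif opt.dataset == 'cifar100' then
   -- Model type specifies number of layers for CIFAR-100 model
   -- placeholder body: replace with your own architecture
   local nClasses = 100                                      -- CIFAR-100 classes
   local model = nn.Sequential()
   model:add(nn.SpatialConvolution(3, 16, 3, 3, 1, 1, 1, 1)) -- 3x32x32 -> 16x32x32
   model:add(nn.SpatialBatchNormalization(16))
   model:add(nn.ReLU(true))
   -- ... your own feedback / recurrent blocks here ...
   model:add(nn.SpatialAveragePooling(32, 32))               -- 16x32x32 -> 16x1x1
   model:add(nn.View(16):setNumInputDims(3))
   model:add(nn.Linear(16, nClasses))

Then set -netType to the file name of your model, e.g. -netType my_model for models/my_model.lua.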