julianser/hed-dlg-truncated

Model Sampling & Testing


To generate model responses using beam search run:

THEANO_FLAGS=mode=FAST_RUN,floatX=float32,device=gpu python sample.py <model_name> <model_outputs> --beam_search --n-samples= --ignore-unk --verbose
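For illustration, a filled-in invocation might look like the following. The model prefix and output file name below are made-up placeholders, not taken from the repository, and --n-samples is set to 5 only as an example value:

# Hypothetical example: "Output/MyModel" stands in for the prefix of the model files
# saved during training, and "beam_outputs.txt" for the file the generated samples
# are presumably written to.
THEANO_FLAGS=mode=FAST_RUN,floatX=float32,device=gpu python sample.py Output/MyModel beam_outputs.txt --beam_search --n-samples=5 --ignore-unk --verbose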
I am confused about the context, beam search, and verbose parameters. Can you give an example of where I can get these three parameters?
The model name is generated automatically during training, so I already have that.
Thanks.