reith/deepspeech-playground

Trying to load sample model fails

Closed this issue · 1 comments

When loading the downloaded TensorFlow-trained model with the following command on the current source branch:

python visualize.py --interactive --weights-file pre-trained/45-best-val-weights.h5 --train-desc-file pre-trained/model_45_config.json

During the interactive session, after setting up model_wrp via the commands provided, I receive the following error:

Traceback (most recent call last):
  File "visualize.py", line 218, in <module>
    main()
  File "visualize.py", line 205, in main
    args.weights_file)
  File "visualize.py", line 107, in interactive_vis
    model.load_weights(weights_file)
  File "/data/Documents/Projects/TUM/IDP/TabShare/TabShare/venv/lib/python3.5/site-packages/keras/engine/topology.py", line 2619, in load_weights
    load_weights_from_hdf5_group(f, self.layers)
  File "/data/Documents/Projects/TUM/IDP/TabShare/TabShare/venv/lib/python3.5/site-packages/keras/engine/topology.py", line 3068, in load_weights_from_hdf5_group
    str(len(filtered_layers)) + ' layers.')
ValueError: You are trying to load a weight file containing 7 layers into a model with 14 layers.

What's the correct way of testing out the pretrained model?

reith commented

In your invocation, replace `--train-desc-file` with `--model-config`.
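
Applying that change to the original command (assuming the same file paths from the report), the corrected invocation would look like:

```shell
# Load the pretrained model config with --model-config instead of --train-desc-file
python visualize.py --interactive \
    --weights-file pre-trained/45-best-val-weights.h5 \
    --model-config pre-trained/model_45_config.json
```

With `--model-config`, the model is rebuilt from the saved architecture description, so its layer count matches the weight file and the `load_weights` layer-mismatch error should not occur.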