f90/Wave-U-Net-Pytorch

Cannot be used out of the box without GPU

Lumrenion opened this issue · 1 comment

As my machine does not have an NVIDIA GPU (it has a Radeon), I had to disable CUDA. But when I remove the --cuda flag, I get an exception:

Using valid convolutions with 97961 inputs and 88409 outputs
Loading model from checkpoint /code/checkpoints/waveunet/model
Traceback (most recent call last):
  File "/code/predict.py", line 73, in <module>
    main(args)
  File "/code/predict.py", line 23, in main
    state = utils.load_model(model, None, args.load_model)
  File "/code/utils.py", line 140, in load_model
    checkpoint = torch.load(path)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 529, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 702, in _legacy_load
    result = unpickler.load()
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 665, in persistent_load
    deserialized_objects[root_key] = restore_location(obj, location)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 156, in default_restore_location
    result = fn(storage, location)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 132, in _cuda_deserialize
    device = validate_cuda_device(location)
  File "/usr/local/lib/python3.6/site-packages/torch/serialization.py", line 116, in validate_cuda_device
    raise RuntimeError('Attempting to deserialize object on a CUDA '
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

To solve this, I had to change line 140 of utils.py from

    checkpoint = torch.load(path)

to

    checkpoint = torch.load(path, map_location='cpu')

This should happen automatically when the --cuda flag is omitted; see the sketch below.
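
For reference, here is a minimal sketch of how the loader could choose the map location based on the flag. The helper name `load_checkpoint` and the `use_cuda` parameter are illustrative, not the repository's exact API:

    import torch

    def load_checkpoint(path, use_cuda):
        """Load a checkpoint, remapping CUDA storages to CPU when CUDA is
        unavailable or not requested (hypothetical helper, not the repo's API)."""
        if use_cuda and torch.cuda.is_available():
            map_location = None  # leave tensors on their saved CUDA devices
        else:
            map_location = torch.device('cpu')  # remap CUDA storages to CPU
        return torch.load(path, map_location=map_location)

With something like this, predict.py could call `checkpoint = load_checkpoint(args.load_model, args.cuda)` and the same CUDA-trained checkpoint would load on both GPU and CPU-only machines.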

f90 commented

Thanks for the report! Fixed.