Can I train the model with multi GPUs?
wiwengweng opened this issue · 2 comments
wiwengweng commented
Hi, I have two Nvidia cards, but when training the model I see that only one card is actually being used, even though both cards' GPU memory is occupied.
root@bogon:~/DeblurGAN# nvidia-smi
Wed Aug  8 13:59:21 2018
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.67                 Driver Version: 390.67                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 108...  Off  | 00000000:02:00.0 Off |                  N/A |
| 53%   58C    P2   222W / 250W |  10793MiB / 11178MiB |     85%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX 108...  Off  | 00000000:82:00.0 Off |                  N/A |
| 47%   25C    P8    10W / 250W |  10631MiB / 11178MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
ravikt commented
You can use Keras's multi_gpu_model utility to replicate the model across both cards.
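As a minimal sketch of what that looks like with Keras 2.x: keras.utils.multi_gpu_model wraps a single-GPU model, splits each batch across the given number of GPUs, and merges the gradients on the CPU. The Sequential model below is only a placeholder standing in for the repo's actual generator.

import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D
from keras.utils import multi_gpu_model

# Placeholder network; substitute the DeblurGAN generator here.
model = Sequential([Conv2D(3, (3, 3), padding='same', input_shape=(256, 256, 3))])

# Replicate the model on 2 GPUs; each batch is split in half and
# run in parallel, with weight updates merged on the CPU.
parallel_model = multi_gpu_model(model, gpus=2)
parallel_model.compile(optimizer='adam', loss='mean_absolute_error')

# Dummy data just to show the call shape.
x = np.random.rand(8, 256, 256, 3).astype('float32')
y = np.random.rand(8, 256, 256, 3).astype('float32')
parallel_model.fit(x, y, epochs=1, batch_size=4)

Note that you compile and fit the wrapped parallel_model, not the original model; the original still holds the shared weights and is the one you should save.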
wiwengweng commented
@ravikt Thanks!