[BUG] APEX and CUDA required for fused optimizers
Hosta-Daniel-Suh opened this issue · 1 comments
Describe the bug
I keep getting an "AssertionError: APEX and CUDA required for fused optimizers" error when I try to run the training script.
Could you suggest how to resolve this?
To Reproduce
Steps to reproduce the behavior:
I installed apex (AMP) using the following commands:
```
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./
```
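After an install like the one above, it's worth verifying that the fused optimizers are actually importable and that CUDA is visible, since the assertion in the error message is triggered when either is missing. A minimal sketch (assumes the NVIDIA apex package layout, where the fused optimizers live under `apex.optimizers`):

```python
# Sanity check: can we import apex's fused optimizers, and is CUDA available?
# If either fails, training with `--opt fusedmomentum` will raise the
# "APEX and CUDA required for fused optimizers" assertion.
try:
    from apex.optimizers import FusedSGD  # fused optimizers live here
    import torch
    apex_fused_ok = torch.cuda.is_available()
except ImportError:
    apex_fused_ok = False

print("fused optimizers available:", apex_fused_ok)
```

If this prints `False`, either the apex CUDA extensions did not build (check the `pip install` log for compiler errors) or no CUDA device is visible to PyTorch.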
These are the commands that I used to run training:
- `./distributed_train.sh 1 /data --model tf_efficientdet_d0 -b 16 --amp --lr .09 --warmup-epochs 5 --sync-bn --opt fusedmomentum --model-ema`
- `./distributed_train.sh 1 /data --model tf_efficientdet_d0 -b 16 --lr .09 --warmup-epochs 5 --sync-bn --opt fusedmomentum --model-ema`
- `./distributed_train.sh 1 /data --model tf_efficientdet_d0 -b 16 --apex-amp --lr .09 --warmup-epochs 5 --sync-bn --opt fusedmomentum --model-ema`
I also tried changing the relevant default argument from False to True, but that didn't work either.
Desktop (please complete the following information):
- OS: Ubuntu 18.04
Don't use the fused optimizers (i.e. drop `--opt fusedmomentum`), or install apex with its CUDA extensions built successfully.
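If installing apex isn't an option, the first part of the suggestion amounts to using plain (non-fused) PyTorch SGD with momentum, which needs neither apex nor its CUDA extensions. A minimal sketch; the linear model and the momentum value of 0.9 are stand-ins for illustration:

```python
import torch

# Stand-in model for illustration; in practice this would be the
# EfficientDet model built by the training script.
model = torch.nn.Linear(4, 2)

# Non-fused SGD with momentum: the apex-free counterpart of a fused
# momentum optimizer, using the learning rate from the commands above.
optimizer = torch.optim.SGD(model.parameters(), lr=0.09, momentum=0.9)

# One illustrative training step.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()
```

The fused variants only change how the update kernels are launched on CUDA; the optimization behavior of plain `torch.optim.SGD` with the same hyperparameters is equivalent.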