Project-MONAI/tutorials

Unrecognized argument --local-rank in multi-GPU training in self_supervised_pretraining

KumoLiu opened this issue · 0 comments

root@6091477:/opt/toolkit/tutorials/monai/self_supervised_pretraining/vit_unetr_ssl/multi_gpu# python -m torch.distributed.launch --nproc_per_node=$(nvidia-smi -L | wc -l) mgpu_ssl_train.py --batch_size=8 --epochs=10 --base_lr=2e-4 --logdir_path=/var/log/ssl_train --json_path=../datalists/tcia/dataset_split.json
/usr/local/lib/python3.10/dist-packages/torch/distributed/launch.py:183: FutureWarning: The module torch.distributed.launch is deprecated
and will be removed in future. Use torchrun.
Note that --use-env is set by default in torchrun.
If your script expects `--local-rank` argument to be set, please
change it to read from `os.environ['LOCAL_RANK']` instead. See 
https://pytorch.org/docs/stable/distributed.html#launch-utility for 
further instructions

  warnings.warn(
[2024-03-15 09:16:37,155] torch.distributed.run: [WARNING] 
[2024-03-15 09:16:37,155] torch.distributed.run: [WARNING] *****************************************
[2024-03-15 09:16:37,155] torch.distributed.run: [WARNING] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
[2024-03-15 09:16:37,155] torch.distributed.run: [WARNING] *****************************************
usage: ViT Self-Supervised Learning [--data_root DATA_ROOT] [--json_path JSON_PATH] [--logdir_path LOGDIR_PATH] [--output PATH] [--local_rank LOCAL_RANK] [--epochs EPOCHS] [--batch_size BATCH_SIZE] [--base_lr BASE_LR] [--seed SEED] [--deterministic]
ViT Self-Supervised Learning: error: unrecognized arguments: --local-rank=1
usage: ViT Self-Supervised Learning [--data_root DATA_ROOT] [--json_path JSON_PATH] [--logdir_path LOGDIR_PATH] [--output PATH] [--local_rank LOCAL_RANK] [--epochs EPOCHS] [--batch_size BATCH_SIZE] [--base_lr BASE_LR] [--seed SEED] [--deterministic]
ViT Self-Supervised Learning: error: unrecognized arguments: --local-rank=0
[2024-03-15 09:16:52,174] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 2) local_rank: 0 (pid: 2009) of binary: /usr/bin/python
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/launch.py", line 198, in <module>
    main()
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/launch.py", line 194, in main
    launch(args)
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/launch.py", line 179, in launch
    run(args)
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/run.py", line 825, in run
    elastic_launch(
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/launcher/api.py", line 137, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/usr/local/lib/python3.10/dist-packages/torch/distributed/launcher/api.py", line 271, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError: 
============================================================
mgpu_ssl_train.py FAILED
------------------------------------------------------------
Failures:
[1]:
  time      : 2024-03-15_09:16:52
  host      : 6091477
  rank      : 1 (local_rank: 1)
  exitcode  : 2 (pid: 2010)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2024-03-15_09:16:52
  host      : 6091477
  rank      : 0 (local_rank: 0)
  exitcode  : 2 (pid: 2009)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
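The failure looks like a launcher/parser mismatch: recent versions of torch.distributed.launch pass --local-rank (hyphen) and always export LOCAL_RANK, while the tutorial script's parser only defines --local_rank (underscore), so argparse rejects the injected flag. As a rough illustration only (the tutorial's actual parser setup may differ), a sketch of how the script could accept both spellings and fall back to the environment variable, as the warning above suggests:

import argparse
import os

# Hypothetical parser setup mirroring the usage string in the error output.
parser = argparse.ArgumentParser("ViT Self-Supervised Learning")
# Accept both --local_rank and --local-rank; argparse maps both to args.local_rank.
# Default to the LOCAL_RANK env var, which torchrun / torch.distributed.launch set.
parser.add_argument(
    "--local_rank", "--local-rank",
    type=int,
    default=int(os.environ.get("LOCAL_RANK", 0)),
    help="local rank injected by the launcher (falls back to LOCAL_RANK env var)",
)
args = parser.parse_args()
print(f"local rank: {args.local_rank}")

Alternatively, since torch.distributed.launch is deprecated, launching with torchrun (which only sets the LOCAL_RANK environment variable and passes no --local-rank flag) should avoid the unrecognized-argument error, assuming the script reads the rank from the environment:

torchrun --nproc_per_node=$(nvidia-smi -L | wc -l) mgpu_ssl_train.py --batch_size=8 --epochs=10 --base_lr=2e-4 --logdir_path=/var/log/ssl_train --json_path=../datalists/tcia/dataset_split.json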