debugging with pdb
wtyuan96 opened this issue · 1 comment
I wanted to debug this program with pdb, so I ran the command below:
python -m pdb train.py \
--dataset_name dtu \
--root_dir $DTU_DIR \
--num_epochs 16 --batch_size 2 \
--depth_interval 2.65 --n_depths 8 32 48 --interval_ratios 1.0 2.0 4.0 \
--optimizer adam --lr 1e-3 --lr_scheduler cosine \
--exp_name exp
then set a breakpoint with
b dataset/dtu.py:148
and continued the program by typing c.
Normally, the program should pause at dataset/dtu.py:148, but instead the message below was printed and the program was restarted:
The program finished and will be restarted
However, when I set a breakpoint in model/mvsnet.py, the program paused as expected.
I am confused about what the difference is between dataset/dtu.py and model/mvsnet.py when debugging with pdb. Could you please help me out?
I solved the problem by setting num_workers=0, which is the default for torch.utils.data.DataLoader. With num_workers > 0, the dataset's __getitem__ runs in DataLoader worker subprocesses, so pdb breakpoints set in the main process on dataset/dtu.py are never hit there.
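For anyone hitting the same issue, here is a minimal sketch (not the repository's actual train.py; ToyDataset is just a placeholder for the real DTU dataset class) illustrating why num_workers matters when debugging dataset code with pdb:

```python
# Minimal sketch: with num_workers=0, __getitem__ runs in the main process,
# so a pdb breakpoint set on dataset code is hit. With num_workers > 0,
# loading happens in worker subprocesses, which the main-process pdb
# session does not follow.
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):  # placeholder for the real dataset class
    def __len__(self):
        return 4

    def __getitem__(self, idx):  # a breakpoint here is only hit with num_workers=0
        return torch.tensor(idx, dtype=torch.float32)

loader = DataLoader(ToyDataset(), batch_size=2, num_workers=0)  # num_workers=0 for pdb

for batch in loader:
    print(batch)
```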
Thanks.