pieterblok/boxal

initial annotation files got deleted!

pkhateri opened this issue · 4 comments

Hello @pieterblok,

I just ran a quick test to see what the different output folders look like and noticed some strange behavior.
The boxal.yaml file is the same as the one in the main repo except for the following:

initial_datasize: 351
pool_size: 2
loops: 1
use_initial_train_dir: True
network_config: "COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml"
pretrained_weights: "weights_2/exp1/uncertainty/model_final.pth"

As a result:

  • I got 180 images selected for annotation in the next round instead of 2.
  • The annotation json files in the initial_train folder are gone! They were not moved to another folder; it seems they were simply deleted.
  • A new folder called annotation was created inside the initial_train folder!

Do you know why this happens? I have already run the code with similar parameters, only with a larger pool_size, and it was fine, so I thought the small pool_size might be the cause.

After cleaning the folders, I reran the program with the same parameters and the issue above did not recur. However, training did not converge either:
FloatingPointError: Predicted boxes or scores contain Inf/NaN. Training has diverged.

Your help would be much appreciated!

@pkhateri well, that is something I haven't experienced before. When use_initial_train_dir is set to True, the code searches for the json files associated with your initial training images.

The images and annotations from the initial_train_dir are moved to the traindir folder, because the further training and sampling continue from there. Have you looked in the traindir folder that you specified in your config file?
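To illustrate the expected behavior, here is a minimal sketch of moving each initial training image together with its matching json annotation into the train directory. This is a hypothetical helper for illustration only, not the actual BoxAL implementation; the file extensions and directory names are assumptions:

```python
import shutil
from pathlib import Path

def move_initial_annotations(initial_train_dir: str, traindir: str) -> list:
    """Move every image plus its matching json annotation from
    initial_train_dir to traindir (hypothetical sketch, not BoxAL code)."""
    src, dst = Path(initial_train_dir), Path(traindir)
    dst.mkdir(parents=True, exist_ok=True)
    moved = []
    for img in sorted(src.glob("*.jpg")):
        ann = img.with_suffix(".json")
        if not ann.exists():
            # an image without a json annotation cannot be used for training
            continue
        shutil.move(str(img), str(dst / img.name))
        shutil.move(str(ann), str(dst / ann.name))
        moved.append(img.name)
    return moved
```

If the files show up in traindir rather than vanishing, the move step itself worked and the problem lies elsewhere (e.g. the sampling step).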

The training problem probably means that your learning rate is too high; try reducing it and see if that resolves the divergence.
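As a sketch of that change, the config edit could look like the fragment below. The key name here is an assumption, so verify it against the actual keys in your boxal.yaml:

```yaml
# boxal.yaml (fragment) -- key name assumed, check your own config file
learning_rate: 0.001   # e.g. reduced from 0.01; keep lowering until training is stable
```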

Sorry for the delay! Will be back to you soon!

Have these problems been solved?

I could not reproduce this problem, so please close this issue. If it happens again, I will write to you with more details.
Thanks very much!