google-research/deeplab2

build coco dataset error!!


I have collected and labeled a small dataset using labelme and converted the labels to COCO JSON format.

I want to do binary semantic segmentation, so I have only one class.

While building the dataset with build_coco_data.py, I got this error:

```
Starts processing dataset split train.
Traceback (most recent call last):
  File "/home/mus/Desktop/Seg/deeplab2/data/build_coco_data.py", line 309, in <module>
    app.run(main)
  File "/home/mus/anaconda3/envs/dl2/lib/python3.9/site-packages/absl/app.py", line 312, in run
    _run_main(main, args)
  File "/home/mus/anaconda3/envs/dl2/lib/python3.9/site-packages/absl/app.py", line 258, in _run_main
    sys.exit(main(argv))
  File "/home/mus/Desktop/Seg/deeplab2/data/build_coco_data.py", line 304, in main
    _convert_dataset(FLAGS.coco_root, dataset_split, FLAGS.output_dir)
  File "/home/mus/Desktop/Seg/deeplab2/data/build_coco_data.py", line 264, in _convert_dataset
    segments_dict = _read_segments(coco_root, dataset_split)
  File "/home/mus/Desktop/Seg/deeplab2/data/build_coco_data.py", line 134, in _read_segments
    annotation_file_name = annotation['file_name']
KeyError: 'file_name'
```
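One thing worth checking: build_coco_data.py reads the COCO *panoptic* annotation format, where every entry in the JSON's `annotations` list has a `file_name` (pointing to a panoptic PNG) and a `segments_info` list. Converters built for the instance/detection format (as many labelme-to-COCO scripts are) emit annotations without those keys, which would produce exactly this `KeyError`. Here is a minimal sketch of a check along those lines; the function name and the set of required keys are my assumptions, not part of deeplab2:

```python
import json

def check_panoptic_annotations(path):
    """Return the keys a COCO panoptic annotation entry needs
    (as read by deeplab2's _read_segments) that are missing from
    the first entry of the given JSON file.

    An empty list suggests the file is in panoptic format; a list
    containing 'file_name' reproduces the KeyError above.
    """
    with open(path) as f:
        data = json.load(f)
    # Assumed required keys for the panoptic format; adjust as needed.
    required = {'image_id', 'file_name', 'segments_info'}
    first = data['annotations'][0]
    return sorted(required - first.keys())
```

For example, running it on `dataset/annotations/panoptic_train2017.json` and getting `['file_name', 'segments_info']` back would mean the JSON is in instance format and needs to be converted to panoptic format (or regenerated with a panoptic-aware converter) before build_coco_data.py can consume it.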

My data root structure is:

```
dataset/
  train2017
  test2017
  val2017
  annotations/
    panoptic_train2017.json
    panoptic_test2017.json
    panoptic_val2017.json
```

Am I missing something?