Training on SUN RGB-D dataset
Hi,
I want to train the ENet model on the SUN RGB-D dataset, but I found that the ground truth labels are not consistent across images.
Following the source code, I load the label of each image with
m = require 'matio'
label = m.load('/path/to/folders/seg.mat').seglabel
Then I draw an output image from the label, giving each index a different color (roughly like the sketch below).
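A minimal sketch of that visualization step, assuming label is the 2D tensor loaded above; the image package and the random palette are my own choices, not the original visualization code:

image = require 'image'
lbl = label:double()                      -- work on a DoubleTensor so apply() can store float color values
palette = torch.rand(lbl:max() + 1, 3)    -- arbitrary random color per index, index 0 included
rgb = torch.Tensor(3, lbl:size(1), lbl:size(2))
for c = 1, 3 do
  -- replace every index v with channel c of its palette color
  rgb[c]:copy(lbl:clone():apply(function(v) return palette[v + 1][c] end))
end
image.save('label_vis.png', rgb)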
But, for example, beds are labelled with a different color/index in the following images,
and other objects also get different indices in different images.
Also, the SUN RGB-D dataset has 38 classes (including the unlabelled class), so the index range should be [0, 37] or [1, 38].
But some seg.mat files contain indices larger than 37 or 38; for example, 45 and 46 appear.
I'm wondering what is wrong with the ground truths.
Many thanks.
Hey, I found the answer.
Closing the issue.
@Nestarneal could you please give more details on how you produced the results on SUNRGBD? I am also trying to train ENet on SUNRGBD, but I have encountered the following problems: (1) some images do not exist in the file sunImgPath.tsv (so I skip those images); (2) values in some images go up to 143, while they are supposed to be in [0, 38].
Thank you very much
@xiaofanglegoc ,
I have no idea about your first problem because I didn't encounter it.
And for the second problem, you can try
m = require 'matio'
m.use_lua_strings = true
gt = m.load('/path/to/seg.mat')
local_label = gt.seglabel   -- per-image label indices
local_table = gt.names      -- per-image class name list
The indices in local_label are based on local_table (each image carries its own class list),
so you need to convert them into global indices before training; see the sketch below.
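A rough sketch of that conversion, assuming the toolbox Metadata folder ships the 37 global class names in seg37list.mat (the path and field name are my guesses; check your copy) and keeping 0 for unlabelled or unmatched pixels:

-- dataset-wide class name list; with m.use_lua_strings = true this should load as a Lua table of strings
global_names = m.load('/path/to/SUNRGBDtoolbox/Metadata/seg37list.mat').seg37list

-- build a name -> global index lookup
global_index = {}
for i, name in ipairs(global_names) do
  global_index[name] = i
end

-- map each per-image (local) index to its global index; 0 stays 0
remap = {[0] = 0}
for i, name in ipairs(local_table) do
  remap[i] = global_index[name] or 0      -- names missing from the global list fall back to unlabelled
end

global_label = local_label:clone():apply(function(v) return remap[v] or 0 end)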
How do I convert the indices into global indices?
In the SUNRGBDtoolbox.zip, under Metadata, there's a file containing all the segmentations. The values are in [0, 37] (both ends included), so there is one extra class for "no-class". But is it 0 or 37? Did you find this out?
Thanks!
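For what it's worth, a quick hedged check of the 0-vs-37 question: print the global class name list (assuming seg37list.mat under Metadata as in the sketch above; the file and field names may differ in your copy). If it holds exactly 37 named classes mapped to labels 1..37, the leftover value 0 would be the "no-class".

m = require 'matio'
m.use_lua_strings = true
names = m.load('/path/to/SUNRGBDtoolbox/Metadata/seg37list.mat').seg37list
print(#names)          -- expect 37 named classes
for i, name in ipairs(names) do
  print(i, name)       -- if these correspond to labels 1..37, then 0 is the unlabelled value
end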