faustomilletari/VNet

evaluation

YeShen1995 opened this issue · 1 comment

Hello, your work is excellent and I am very interested in doing further research on it. Could you also provide the evaluation code? Thank you!!!

I was confused about that as well.

Here is my understanding (please correct me if anybody has more insight on this):

VNet was developed for the PROMISE12 challenge. The challenge does not share its test ground truth or evaluation code with anybody (it only provides a text description of the metrics). The only way to get a score is to submit your results to the PROMISE12 website.

Of course, you can try to write the evaluation code yourself by referring to the evaluation guide below (rough sketches of the metrics follow after the quote):

"The metrics were calculated as follows:
The Dice coefficient: two times the amount of overlapping voxels between the reference segmentation and the submitted results, divided by the sum of the amount of voxels in both the reference segmentation and the submitted result.
Relative absolute volume difference: amount of voxels in the submitted result divided by the amount of voxels in the reference standard. The result minus 1 and times 100 is the relative volume difference. The absolute relative volume difference was used as a metric.
Average boundary distance and 95th percentile Haussdorf distance: extract the surface of the submitted result and the surface of the reference standard. Calculate a distance map (using itkDanielsonDistanceMap) to the surface for the reference standard, mask it with the surface of the submitted result and calculate a histogram of the resulting distances. Again create a histogram of the remaining distances and pick the largest average and 95th percentiles out of the two histograms."
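The first two metrics translate fairly directly into NumPy. A minimal sketch (this is not the official PROMISE12 evaluation code; the function names are mine, and both inputs are assumed to be binary 3D masks of the same shape):

```python
import numpy as np

def dice_coefficient(result, reference):
    """Dice: 2 * overlap / (voxels in result + voxels in reference)."""
    result = np.asarray(result, dtype=bool)
    reference = np.asarray(reference, dtype=bool)
    intersection = np.logical_and(result, reference).sum()
    return 2.0 * intersection / (result.sum() + reference.sum())

def relative_absolute_volume_difference(result, reference):
    """|result voxels / reference voxels - 1| * 100, in percent."""
    result_voxels = np.count_nonzero(result)
    reference_voxels = np.count_nonzero(reference)
    return abs(float(result_voxels) / reference_voxels - 1.0) * 100.0
```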
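The surface-based metrics take a bit more work. Here is a rough sketch using scipy.ndimage, where distance_transform_edt stands in for the itkDanielsonDistanceMap mentioned in the guide, the erosion-based surface extraction and the `spacing` argument are my own assumptions, and the results will not match the official scores exactly:

```python
import numpy as np
from scipy import ndimage

def surface_voxels(mask):
    """Surface = voxels of the mask removed by one binary erosion."""
    mask = np.asarray(mask, dtype=bool)
    return mask & ~ndimage.binary_erosion(mask)

def surface_distances(result, reference, spacing=(1.0, 1.0, 1.0)):
    """Distances from each surface voxel of `result` to the surface of `reference`."""
    ref_surface = surface_voxels(reference)
    res_surface = surface_voxels(result)
    # Distance map to the reference surface (stand-in for itkDanielsonDistanceMap),
    # then masked with the surface of the submitted result.
    dist_to_ref = ndimage.distance_transform_edt(~ref_surface, sampling=spacing)
    return dist_to_ref[res_surface]

def boundary_metrics(result, reference, spacing=(1.0, 1.0, 1.0)):
    """Average boundary distance and 95th percentile Hausdorff distance,
    taking the larger value of the two directions as described in the guide."""
    d_res_to_ref = surface_distances(result, reference, spacing)
    d_ref_to_res = surface_distances(reference, result, spacing)
    avg = max(d_res_to_ref.mean(), d_ref_to_res.mean())
    hd95 = max(np.percentile(d_res_to_ref, 95), np.percentile(d_ref_to_res, 95))
    return avg, hd95
```

Passing the image's voxel spacing matters here, because the challenge reports these distances in millimetres and the PROMISE12 volumes are anisotropic.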