codeslake/DMENet

The max_coc of datasets is not 28.

chanwental opened this issue · 1 comment

Hi,
I read a blur_map in your dataset as follows:
image = (np.float32(cv2.imread(file_name, cv2.IMREAD_UNCHANGED)) / 10.)[:, :, 1]
The maximum value of the image is then 15, i.e. 15 = (61 - 1) / 4.
This implies max_coc = 61, which is inconsistent with the value of 28 reported in your paper.
What is the problem?
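
For reference, here is a self-contained version of that check (the file path below is just a placeholder, and the conversion back to max_coc assumes sigma = (max_coc - 1) / 4):

```python
# Self-contained version of the check above; the path is a placeholder and the
# channel index / scaling follow the snippet in the question.
import cv2
import numpy as np

file_name = "path/to/blur_map.png"  # placeholder path
image = (np.float32(cv2.imread(file_name, cv2.IMREAD_UNCHANGED)) / 10.)[:, :, 1]

max_val = image.max()              # observed maximum: 15
implied_max_coc = 4 * max_val + 1  # 61, assuming sigma = (max_coc - 1) / 4
print(max_val, implied_max_coc)
```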

Hi, Wentao

Regarding your question,
I've looked at the initial code and found the exact answer.

In the initial SYNDOF code, sigma was computed as:

sigma = (max_coc - 1) / 2
with the initial max_coc = 15.

But after generating SYNDOF, we found that the largest blur size was actually around 28~30.
So we physically analyzed the blur kernel and found that

sigma = (max_coc - 1) / 4

actually produces a blur kernel with a diameter similar to max_coc.
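
For intuition, here is an illustrative sketch (not the SYNDOF code; the 2*sigma truncation is my own assumption for this example): with sigma = (max_coc - 1) / 4, a Gaussian truncated at +/- 2*sigma spans exactly max_coc pixels, which is one way to see why the kernel diameter comes out close to max_coc.

```python
# Illustrative only: relate sigma = (max_coc - 1) / 4 to a kernel diameter of max_coc,
# assuming the Gaussian is truncated at +/- 2*sigma.
import numpy as np

max_coc = 29
sigma = (max_coc - 1) / 4.0        # = 7.0
radius = int(round(2 * sigma))     # 2-sigma truncation radius = 14
diameter = 2 * radius + 1          # 29 pixels == max_coc

x = np.arange(-radius, radius + 1)
kernel = np.exp(-x**2 / (2 * sigma**2))
kernel /= kernel.sum()             # normalize the truncated kernel

print(diameter, kernel.size)       # 29 29
```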

However, instead of regenerating SYNDOF and retraining DMENet (we had run out of time for CVPR 2019),
we wrote "sigma = max_coc / 4" in the paper and defined our max_coc = 28.
We then reflected the corrected equation in the SYNDOF generation code for the final release, in which sigma is defined as sigma = (max_coc - 1) / 4.

Please note that this does not make any difference in the SYNDOF generation process.
That is, if the random seeds are the same, the initial SYNDOF (max_coc = 15) generates the same images as the refined SYNDOF (max_coc = 29).
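
As a quick sanity check (plain arithmetic, not SYNDOF code), all three parameterizations mentioned above correspond to the same maximum sigma:

```python
# The three parameterizations all give the same maximum sigma of 7,
# which is why the generated images are identical given the same random seeds.
sigma_initial = (15 - 1) / 2.0   # initial SYNDOF: sigma = (max_coc - 1) / 2, max_coc = 15
sigma_paper   = 28 / 4.0         # paper:          sigma = max_coc / 4,       max_coc = 28
sigma_refined = (29 - 1) / 4.0   # refined SYNDOF: sigma = (max_coc - 1) / 4, max_coc = 29
assert sigma_initial == sigma_paper == sigma_refined == 7.0
```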

I hope this answers your question.

Best,
Junyong