reachsumit/deep-unet-for-satellite-image-segmentation

Mathematical correctness of prediction and the mean

scimad opened this issue · 3 comments

Firstly, there are only 7 cases in your predict.py where, I believe, you are trying to augment the input image (4 rotation states and 2 flip states). Shouldn't there be 8 cases?

Next, when taking the average, you recursively average the current result with the previous running average. Doesn't that give more weight to the latest prediction?

For example, say we have to take the average of three results a, b, and c (in your case there should be 8 results, though you only have 7).

The average should be:

(a + b + c) / 3

whereas you are computing:

[ (a + b) / 2 + c ] / 2 = (a + b + 2c) / 4,

which is not the same as (a + b + c) / 3. Carried through all 7 cases, the effective weights become 1/64, 1/64, 1/32, 1/16, 1/8, 1/4, 1/2 instead of 1/7 each.

What am I missing here?
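For reference, this is roughly what I would expect instead: a rough, untested sketch that covers all 8 cases and gives each prediction equal weight. It reuses `predict`, `PATCH_SZ` and `N_CLASSES` from your predict.py and assumes `predict()` returns an (H, W, N_CLASSES) array; the `predict_with_tta` name is just for illustration.

```python
import numpy as np

def predict_with_tta(img, model):
    # Collect all 8 dihedral augmentations (4 rotations x 2 flip states),
    # undo each transform on its prediction, then take one equal-weight mean.
    preds = []
    for flip in (False, True):
        flipped = img[::-1, :, :] if flip else img
        for k in range(4):
            p = predict(np.rot90(flipped, k), model,
                        patch_sz=PATCH_SZ, n_classes=N_CLASSES)
            # undo the rotation first, then the flip, so all predictions align
            p = np.rot90(p, -k)
            if flip:
                p = p[::-1, :, :]
            preds.append(p)
    # every one of the 8 predictions gets weight 1/8
    return np.mean(np.stack(preds), axis=0).transpose([2, 0, 1])
```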

Hey there!! You're right. It looks like this repo does not contain the final code that I submitted for my class assignment, and I have lost the original files from my Uni cluster. Could you submit a pull request for this?

Hi there!
Is it possible to predict without augmenting the input image? I mean deleting this part from the code and running it:
```python
for i in range(7):
    if i == 0:  # reverse first dimension
        mymat = predict(img[::-1,:,:], model, patch_sz=PATCH_SZ, n_classes=N_CLASSES).transpose([2,0,1])
        #print(mymat[0][0][0], mymat[3][12][13])
        print("Case 1", img.shape, mymat.shape)
    elif i == 1:  # reverse second dimension
        temp = predict(img[:,::-1,:], model, patch_sz=PATCH_SZ, n_classes=N_CLASSES).transpose([2,0,1])
        #print(temp[0][0][0], temp[3][12][13])
        print("Case 2", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ temp[:,::-1,:], mymat ]), axis=0 )
    elif i == 2:  # transpose (interchange) first and second dimensions
        temp = predict(img.transpose([1,0,2]), model, patch_sz=PATCH_SZ, n_classes=N_CLASSES).transpose([2,0,1])
        #print(temp[0][0][0], temp[3][12][13])
        print("Case 3", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ temp.transpose(0,2,1), mymat ]), axis=0 )
    elif i == 3:
        temp = predict(np.rot90(img, 1), model, patch_sz=PATCH_SZ, n_classes=N_CLASSES)
        #print(temp.transpose([2,0,1])[0][0][0], temp.transpose([2,0,1])[3][12][13])
        print("Case 4", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ np.rot90(temp, -1).transpose([2,0,1]), mymat ]), axis=0 )
    elif i == 4:
        temp = predict(np.rot90(img, 2), model, patch_sz=PATCH_SZ, n_classes=N_CLASSES)
        #print(temp.transpose([2,0,1])[0][0][0], temp.transpose([2,0,1])[3][12][13])
        print("Case 5", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ np.rot90(temp, -2).transpose([2,0,1]), mymat ]), axis=0 )
    elif i == 5:
        temp = predict(np.rot90(img, 3), model, patch_sz=PATCH_SZ, n_classes=N_CLASSES)
        #print(temp.transpose([2,0,1])[0][0][0], temp.transpose([2,0,1])[3][12][13])
        print("Case 6", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ np.rot90(temp, -3).transpose(2,0,1), mymat ]), axis=0 )
    else:
        temp = predict(img, model, patch_sz=PATCH_SZ, n_classes=N_CLASSES).transpose([2,0,1])
        #print(temp[0][0][0], temp[3][12][13])
        print("Case 7", temp.shape, mymat.shape)
        mymat = np.mean( np.array([ temp, mymat ]), axis=0 )
```
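I imagine the whole loop would then reduce to the single un-augmented call from the final else branch, something like this (untested):

```python
# single prediction on the original image, no test-time augmentation
mymat = predict(img, model, patch_sz=PATCH_SZ, n_classes=N_CLASSES).transpose([2, 0, 1])
```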

Hey @reachsumit, I could do that, but not for at least a couple of weeks.

@fadili97 I don't believe that's a good idea; you would lose the benefit of averaging over the augmented predictions.