Mr-TalhaIlyas/EMPatches

Bug with merging torch tensors


Hey, so I discovered an error in the merge_batch function in empatches.py: it converts m_patches into a NumPy array but then attempts to call permute on it, which is a torch-only method that NumPy arrays don't have. The fix is to substitute transpose for permute.

m_patches = []
for p, i in zip(b_patches, b_indices):
    m = super().merge_patches(p, i, mode)
    m_patches.append(m)

# Stack the merged images into a single NumPy array.
m_patches = np.asarray(m_patches)

# np.ndarray has no permute method; transpose with an axis
# tuple is the NumPy equivalent, so use it here instead.
if self.typ == 'torch':
    m_patches = m_patches.transpose(0, 3, 2, 1)

return m_patches
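
For anyone hitting this before the package is updated: the root cause is that permute is a torch.Tensor method, while ndarray.transpose with an axis tuple performs the same axis reordering in NumPy, which is why the substitution works. A minimal sketch demonstrating the equivalence (the shapes below are just illustrative, not tied to EMPatches):

import numpy as np
import torch

# A small batch of images in channels-last layout: (B, H, W, C).
arr = np.random.rand(2, 64, 48, 3).astype(np.float32)

# arr.permute(0, 3, 2, 1) would raise AttributeError here, because
# permute exists only on torch tensors. NumPy's transpose takes the
# same axis tuple and produces the same reordering:
reordered_np = arr.transpose(0, 3, 2, 1)
reordered_pt = torch.from_numpy(arr).permute(0, 3, 2, 1)

assert reordered_np.shape == tuple(reordered_pt.shape)
assert np.allclose(reordered_np, reordered_pt.numpy())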

Thanks for noticing. I fixed it in the repo but forgot to update the package.