ELEKTRONN/elektronn3

Error with 3D convolution on custom kernels

Closed this issue · 2 comments

I wanted to add my own 3D kernel to every layer. I tried using this:

```python
def conv3(in_channels, out_channels, kernel_size=3, stride=1,
          padding=1, bias=True, planar=False, dim=3):
    """Returns an appropriate spatial convolution layer, depending on args.
    - dim=2: Conv2d with 3x3 kernel
    - dim=3 and planar=False: Conv3d with 3x3x3 kernel
    - dim=3 and planar=True: Conv3d with 1x3x3 kernel
    """
    if planar:
        stride = planar_kernel(stride)
        padding = planar_pad(padding)
        kernel_size = planar_kernel(kernel_size)

    # Custom 3x3x3 kernel that should replace the default initialization
    weights = torch.tensor([[[4., 1., 4.],
                             [1., 1., 1.],
                             [4., 1., 4.]],
                            [[1., 1., 1.],
                             [1., 10., 1.],
                             [1., 1., 1.]],
                            [[4., 1., 4.],
                             [1., 1., 1.],
                             [4., 1., 4.]]])
    weightsu = weights.view(3, 3, 3).repeat(1, 1, 1, 1, 1)

    kernel = get_conv(dim)(
        in_channels,
        out_channels,
        kernel_size=kernel_size,
        stride=stride,
        padding=padding,
        bias=bias
    )
    print(kernel.weight.shape)
    with torch.no_grad():
        kernel.weight = nn.Parameter(weightsu)

    return kernel
```

But I got an error like this:

```
Testing 3D U-Net with n_blocks = 1, planar_blocks = ()...
torch.Size([32, 1, 3, 3, 3])
torch.Size([32, 32, 3, 3, 3])

RuntimeError: Given weight of size [1, 1, 3, 3, 3], expected bias to be 1-dimensional with 1 elements, but got bias of size [32] instead
```

I kindly request your help in fixing this!

@Optiligence
@my-tien
@xeray
@mdraw
@jmrk84

With random kernels and the planar setting, I get this output:

```
Testing 3D U-Net with n_blocks = 1, planar_blocks = ()...
torch.Size([32, 1, 3, 3, 3])
torch.Size([32, 32, 3, 3, 3])
Testing 3D U-Net with n_blocks = 1, planar_blocks = (0,)...
torch.Size([32, 1, 1, 3, 3])
torch.Size([32, 32, 1, 3, 3])
...
All tests sucessful!
```
mdraw commented

This issue is not really related to elektronn3. If I understand correctly, you want to change the initialization of your weights. Note that the `get_conv(dim)` call here evaluates to `torch.nn.Conv3d`. Please refer to the PyTorch docs and https://discuss.pytorch.org/ if you need help with changing the weights of `torch.nn.Conv3d` modules.
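
For reference, a minimal sketch (plain PyTorch, not elektronn3 code; the variable names `base` and `conv` are illustrative only) of one way to assign a fixed 3x3x3 kernel to a `torch.nn.Conv3d`. The replacement weight has to match the full `(out_channels, in_channels, kD, kH, kW)` shape that `kernel.weight` already has (here `[32, 1, 3, 3, 3]`), not `[1, 1, 3, 3, 3]`:

```python
# Minimal sketch: assigning a fixed 3x3x3 kernel to a Conv3d layer.
import torch
import torch.nn as nn

in_channels, out_channels = 1, 32  # matches the first printed weight shape [32, 1, 3, 3, 3]

base = torch.tensor([[[4., 1., 4.],
                      [1., 1., 1.],
                      [4., 1., 4.]],
                     [[1., 1., 1.],
                      [1., 10., 1.],
                      [1., 1., 1.]],
                     [[4., 1., 4.],
                      [1., 1., 1.],
                      [4., 1., 4.]]])

conv = nn.Conv3d(in_channels, out_channels, kernel_size=3,
                 stride=1, padding=1, bias=True)

# Conv3d weights have shape (out_channels, in_channels, kD, kH, kW), so the
# 3x3x3 kernel is tiled across both channel dimensions before replacing conv.weight.
weights = base.repeat(out_channels, in_channels, 1, 1, 1)
assert weights.shape == conv.weight.shape  # torch.Size([32, 1, 3, 3, 3])

with torch.no_grad():
    conv.weight.copy_(weights)
```

Using `copy_` under `torch.no_grad()` keeps the existing parameter and its `[out_channels]`-sized bias in place, which avoids the weight/bias size mismatch from the traceback above; assigning a freshly constructed `nn.Parameter` of the same full shape would also work.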