NVIDIAGameWorks/kaolin

Asking for support for a linear layer on SPC features.

sparse-mvs-2 opened this issue · 0 comments

The kaolin.ops.spc.Conv3d method handles batches of SPC features well. However, when I want to write a self-attention layer, I need to use nn.Linear. That means I have to transfer the SPC features of shape (L, C) to (B, l_max, C) with padding. Is there an elegant way to handle this without padding and transferring the features to (B, l_max, C)?
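One observation that may help: PyTorch's nn.Linear operates only on the last dimension of its input, so a pointwise linear transform can be applied directly to the packed (L, C) feature tensor without any padding. The sketch below assumes hypothetical sizes (L, C_in, C_out); it is not kaolin-specific code, just a demonstration that the packed layout is already a valid nn.Linear input:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: L = total occupied voxels across the whole batch,
# as in the packed SPC feature layout described above.
L, C_in, C_out = 10, 8, 16
feats = torch.randn(L, C_in)  # packed features, shape (L, C_in)

# nn.Linear acts on the last dim, so the packed (L, C) layout works as-is;
# no (B, l_max, C) padding is required for a pointwise projection.
linear = nn.Linear(C_in, C_out)
out = linear(feats)
print(out.shape)  # torch.Size([10, 16])
```

Note this only covers the pointwise projections (e.g. Q/K/V); the attention step itself still needs per-sample grouping, since attention must not mix points from different batch elements.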