[Bug] Chunks are not correctly formed
Closed this issue · 1 comment
juliendenize commented
Issue
Chunks are not correctly formed for BatchLabels, BatchBoundingBoxes and BatchMasks.
Example for BatchMasks:
```python
def get_chunk(self, chunk_indices: torch.Tensor) -> BatchMasks:
    chunk_idx_sample = torch.tensor(
        [0]
        + [
            self.idx_sample[chunk_indice + 1] - self.idx_sample[chunk_indice]
            for chunk_indice in chunk_indices
        ]
    )
    chunk_idx_sample = chunk_idx_sample.cumsum(0).tolist()

    return BatchMasks(
        self[chunk_indices],  # wrong select
        idx_sample=chunk_idx_sample,
        device=self.device,
        requires_grad=self.requires_grad,
    )
```
Suggestions
- Create a _BatchConcatenatedTATensor private class to inherit from
- Fix the select for each class
- Add tests for these classes
juliendenize commented
Still buggy when several different chunks are selected, because samples_ranges is lost during the process.
Suggestions:
- Cover the transform tests
- Test samples_ranges
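A sketch of the kind of regression test suggested above: after selecting samples, the per-sample boundaries (`samples_ranges` / `idx_sample` in this issue) must still describe the selected data. Plain lists stand in for the tensor classes here, and `select_samples` is a hypothetical helper, not the library's API:

```python
def select_samples(data, boundaries, indices):
    # Gather each selected sample's element range and rebuild boundaries.
    out, ranges = [], [0]
    for i in indices:
        part = data[boundaries[i]:boundaries[i + 1]]
        out.extend(part)
        ranges.append(ranges[-1] + len(part))
    return out, ranges

def test_samples_ranges_preserved():
    data = list(range(6))        # 3 samples of sizes 2, 1 and 3
    boundaries = [0, 2, 3, 6]
    selected, ranges = select_samples(data, boundaries, [2, 0])
    # Boundaries must be the cumulative sizes of the selected samples...
    assert ranges == [0, 3, 5]
    # ...and the data must be the selected samples' elements, in order.
    assert selected == [3, 4, 5, 0, 1]

test_samples_ranges_preserved()
```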