Continvvm/continuum

Turning RehearsalMemory into nn.Module

oleksost opened this issue · 5 comments

It would be nice to turn the RehearsalMemory class into an nn.Module and make its buffers (self._x, self._y, self._t in RehearsalMemory) PyTorch buffers (with register_buffer). That way one could, for example, save and load the buffers with PyTorch functionality, e.g. using .state_dict().

But that would require x, y, and t to be PyTorch tensors, while they are currently NumPy arrays.
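For concreteness, here is a minimal sketch of what the proposal could look like, with the NumPy-to-tensor conversion made explicit (the class name, constructor signature, and file name are hypothetical; only the `_x`/`_y`/`_t` attributes come from `RehearsalMemory`):

```python
import numpy as np
import torch
from torch import nn


class RehearsalMemoryModule(nn.Module):
    """Hypothetical nn.Module wrapper around a rehearsal buffer."""

    def __init__(self, x: np.ndarray, y: np.ndarray, t: np.ndarray):
        super().__init__()
        # register_buffer only accepts torch tensors, hence the conversion
        # from the NumPy arrays the class currently stores.
        self.register_buffer("_x", torch.from_numpy(x))
        self.register_buffer("_y", torch.from_numpy(y))
        self.register_buffer("_t", torch.from_numpy(t))


memory = RehearsalMemoryModule(
    x=np.zeros((10, 3, 32, 32), dtype=np.float32),
    y=np.zeros(10, dtype=np.int64),
    t=np.zeros(10, dtype=np.int64),
)
torch.save(memory.state_dict(), "memory.pt")     # buffers are saved...
memory.load_state_dict(torch.load("memory.pt"))  # ...and restored
```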

If you only want the ability to save/load, maybe we could simply use NumPy's saving/loading functions. Or did you have more ideas in mind for using buffers than just saving/loading?
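If we go the NumPy route instead, the save/load could be as simple as the sketch below (assuming a `memory` object exposing the `_x`/`_y`/`_t` arrays):

```python
import numpy as np

# Save the three buffers into a single .npz archive...
np.savez("memory.npz", x=memory._x, y=memory._y, t=memory._t)

# ...and restore them later.
data = np.load("memory.npz")
memory._x, memory._y, memory._t = data["x"], data["y"], data["t"]
```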

@oleksost something like this #193 ?

> Or did you have more ideas in mind for using buffers than just saving/loading?

- I actually thought about this when I had to calculate the memory footprint of a model that had the replay buffer as an attribute; since the replay buffer was not a PyTorch object, I had to do a separate calculation for it.

> @oleksost something like this #193 ?

Ok, so we could actually add a property to the Memory that computes the memory footprint.
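When the samples are held in memory as NumPy arrays, such a property could boil down to summing `nbytes` (a sketch on a stripped-down stand-in class; the property name `memory_footprint` is hypothetical):

```python
import numpy as np


class RehearsalMemory:
    """Stripped-down stand-in for the real class, for illustration only."""

    def __init__(self, x: np.ndarray, y: np.ndarray, t: np.ndarray):
        self._x, self._y, self._t = x, y, t

    @property
    def memory_footprint(self) -> int:
        # nbytes is the in-memory size of each NumPy array, in bytes.
        return self._x.nbytes + self._y.nbytes + self._t.nbytes
```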

The only gotcha is that if the images are stored as paths, we would need to open all of them to compute their in-memory size. Or would it be better to compute their size on disk (thus compressed)? Any idea, @TLESORT?

I think the memory footprint can be estimated just from the number of saved samples and the model's input shape, without considering the true size of the images or compression.
Ignoring the true size is fine because, theoretically, if we were really saving images in the replay buffer, we would save the model's input, so there would be no need to save the original images.
And considering compression is probably too complex.
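A back-of-the-envelope version of that estimate (all concrete numbers here are illustrative, assuming float32 inputs):

```python
import numpy as np

nb_samples = 1000          # samples kept in the replay buffer
input_shape = (3, 32, 32)  # the model's input shape
itemsize = 4               # bytes per element for float32

estimated_bytes = nb_samples * int(np.prod(input_shape)) * itemsize
print(f"~{estimated_bytes / 2**20:.1f} MiB")  # ~11.7 MiB
```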