TianhongDai/hindsight-experience-replay

Unnecessary copying makes sampling time explode

carlo- opened this issue · 2 comments

Hello,
I noticed that the per-epoch training time grows over the course of training; it seems to scale with the buffer size. After a bit of research, I think the reason is the .copy() in this line of the sampling function.

I might have missed something, but it seems that the copying is not needed; the original implementation in baselines doesn't do it either. Removing it seems to fix the issue.
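
For context, here is a minimal sketch of the pattern I mean (the names `buffers`, `current_size`, and `sample_func` are assumptions based on the usual replay-buffer layout, not exact code from the repo):

```python
import numpy as np

class ReplayBuffer:
    """Minimal HER-style replay buffer, sketched for illustration only."""

    def __init__(self, size, obs_dim, goal_dim, action_dim, sample_func):
        self.size = size
        self.current_size = 0
        self.sample_func = sample_func  # e.g. the HER transition sampler
        # flat numpy storage, keyed the way such buffers typically are
        self.buffers = {
            'obs': np.empty([size, obs_dim]),
            'ag': np.empty([size, goal_dim]),
            'g': np.empty([size, goal_dim]),
            'actions': np.empty([size, action_dim]),
        }

    def sample(self, batch_size):
        temp_buffers = {}
        for key in self.buffers:
            # Slow version: copies everything stored so far on every call,
            # so the cost grows with current_size (i.e. with training time):
            #   temp_buffers[key] = self.buffers[key][:self.current_size].copy()
            # A plain slice is just a view into the same memory (O(1));
            # it is safe as long as sample_func only reads from it:
            temp_buffers[key] = self.buffers[key][:self.current_size]
        return self.sample_func(temp_buffers, batch_size)
```

Since the sampler only gathers rows from these arrays (fancy indexing already returns fresh arrays), the extra defensive copy of the whole buffer on every sample call shouldn't be needed.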

@carlo- Hi Carlo, sorry for replying so late! Thank you so much for the suggestion; I have removed the .copy() call.

Happy to help!