google-deepmind/reverb

TrajectoryDataset, batch and iterator

mikygit opened this issue · 3 comments

Hello,
If I create a batch from a trajectory_dataset once and for all, would iterating through the batch (using as_numpy_iterator) take into account new samples added to the Reverb buffer after the batch was created?

dataset = reverb.TrajectoryDataset.from_table_signature(...)
batch = dataset.batch(...)
#
# data added
#

iterator = batch.as_numpy_iterator()
for i in range(xxx):
    samples = next(iterator)  # ==> would the samples contain the ones added after the creation of the batch?

Thanks.

Anybody?

Hi Mike, yes - new entries added to the Reverb Table will show up in your dataset. All batch() does is group multiple dataset elements into a single element; the dataset pipeline still samples new data from the underlying data source. This question isn't Reverb-specific; see tf.data.Dataset.batch.
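To illustrate the point, here is a minimal sketch using plain tf.data rather than Reverb, with a hypothetical in-memory buffer standing in for the table. Batching does not snapshot the data; elements are pulled from the source lazily each time the iterator is advanced, so items added after the iterator was created can still show up in later batches.

import tensorflow as tf

# Hypothetical in-memory buffer standing in for the Reverb table.
buffer = [1, 2, 3]

def gen():
    # Yields whatever is in the buffer at the moment the iterator asks for it.
    for _ in range(6):
        yield buffer.pop(0) if buffer else -1

dataset = tf.data.Dataset.from_generator(
    gen, output_signature=tf.TensorSpec(shape=(), dtype=tf.int32))
batch = dataset.batch(2)

iterator = batch.as_numpy_iterator()
first = next(iterator)           # pulls elements from the buffer on demand
buffer.extend([10, 11, 12])      # "new data" arrives after the iterator exists
second = next(iterator)          # pulled lazily, so the new data can appear here
print(first, second)

The same logic applies when the underlying source is a Reverb table: each call to next() triggers fresh sampling from the table, regardless of when batch() or the iterator was created.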

cheers.