kentsommer/pytorch-value-iteration-networks

Prebuilt Dataset Generation

Closed this issue · 5 comments

Hello,

I was wondering how you generated the prebuilt datasets that are downloaded when running download_weights_and_datasets.sh, i.e. what were the max_obs and max_obs_size parameters?

Did you follow this file in the original repo?
https://github.com/avivt/VIN/blob/master/scripts/make_data_gridworld_nips.m

Thanks,
Emilio

Hi @eparisotto,

Yes, the parameters were set to match Aviv's implementation.

Thanks!

Sorry for re-opening this, but when I visualized the mazes, the original .mat maze files looked a lot sparser than the provided .npz files. I regenerated the gridworld using the parameters from https://github.com/avivt/VIN/blob/master/scripts/make_data_gridworld_nips.m:
main(dom_size=[16,16], n_domains=5000, max_obs=40, max_obs_size=1.0, n_traj=7, state_batch_size=1)
in dataset/make_training_data.py. The .npz file generated by this call differs from the prebuilt 16x16 .npz mazes: the one I generated is much sparser, like the .mat mazes. The mazes it produces are also closer to the .mat files, though still slightly different.
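For anyone wanting to reproduce this comparison, a simple sanity check is to measure obstacle density (fraction of occupied cells) in each dataset. The sketch below is a minimal, hypothetical version of that check; it assumes nonzero cells encode obstacles, and the `arr_0` key shown in the commented-out usage is just the NumPy default, not necessarily what this repo's generator writes (inspect `data.files` first):

```python
import numpy as np

def obstacle_density(images):
    """Average fraction of occupied cells over a batch of gridworld
    images. Assumes nonzero == obstacle; check how your generator
    encodes free vs. occupied cells before trusting the numbers."""
    images = np.asarray(images)
    return float((images != 0).mean())

# Hypothetical usage against a real dataset file (key name is an assumption):
# data = np.load("dataset/gridworld_16x16.npz")
# print(data.files)                      # discover the actual array keys
# print(obstacle_density(data["arr_0"]))

# Self-contained demo on synthetic 16x16 grids:
dense = np.random.default_rng(0).integers(0, 2, size=(10, 16, 16))
sparse = np.zeros((10, 16, 16))
sparse[:, 0, 0] = 1  # a single obstacle per maze
print(obstacle_density(dense) > obstacle_density(sparse))  # True
```

Comparing this density between the prebuilt .npz, a regenerated .npz, and the original .mat data should make any parameter mismatch obvious at a glance.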

Do you know if there are any significant differences in the maze generator between the original code and this reimplementation?

There should be no major differences between this implementation and Aviv's. It is of course possible there is a bug somewhere...

I'll take a look at it and see if I can figure out the cause. It has been a while so perhaps I typo'ed the parameters when I was generating the defaults. Thanks for the report!

Regenerated with matching parameters in 15fefd5