j96w/MimicPlay

Training low-level controllers using real-world datasets


using obs modality: low_dim with keys: ['robot0_eef_pos', 'robot0_gripper_qpos', 'robot0_eef_quat']
using obs modality: rgb with keys: ['agentview_image', 'robot0_eye_in_hand_image']
using obs modality: depth with keys: []
using obs modality: scan with keys: []

============= Loaded Environment Metadata =============
obs key agentview_image with shape (120, 120, 3)
obs key robot0_eef_pos with shape (1, 4)
run failed with error:
"Unable to synchronously open object (object 'robot0_eef_quat' doesn't exist)"

Traceback (most recent call last):
File "scripts/train.py", line 371, in main
train(config, device=device)
File "scripts/train.py", line 78, in train
shape_meta = FileUtils.get_shape_metadata_from_dataset(
File "/home/lhy/robomimic/robomimic/utils/file_utils.py", line 144, in get_shape_metadata_from_dataset
initial_shape = demo["obs/{}".format(k)].shape[1:]
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "/home/lhy/anaconda3/envs/mimicplay/lib/python3.8/site-packages/h5py/hl/group.py", line 357, in getitem
oid = h5o.open(self.id, self.e(name), lapl=self.lapl)
File "h5py/objects.pyx", line 54, in h5py.objects.with_phil.wrapper
File "h5py/objects.pyx", line 55, in h5py.objects.with_phil.wrapper
File "h5py/h5o.pyx", line 241, in h5py.h5o.open
KeyError: "Unable to synchronously open object (object 'robot0_eef_quat' doesn't exist)"
Hello, when I used the real-world data example you provided for training, I was able to successfully train the high-level planner. However, when I then used the trained high-level planner to train the low-level controller, I encountered the error above. The explanation is that the HDF5 dataset file obtained from the video data you provided is missing the "robot0_eef_quat" key. The command I used for training is "python scripts/train.py --config config/low-level.json --dataset 'datasets/playdata/demo_hand_loc_1_new.hdf5' --bddl_file 'scripts/bddl_files/KITCHEN_SCENE9_eval_task-1_turn_on_stove_put_bowl_on_shelf.bddl' --video_prompt 'datasets/eval_task-1_turn_on_stove_put_bowl_on_shelf/image_demo.hdf5'". Do you know how to solve it?
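
For context, the failing robomimic call reads the shape of every obs key listed in the training config from the first demo group in the HDF5 file, so any key that appears in the config but not in the file raises exactly this KeyError. Below is a minimal sketch of that behavior, assuming the usual data/demo_*/obs layout (not the exact robomimic implementation):

import h5py

def get_obs_shapes(hdf5_path, obs_keys):
    # Read the per-step shape of each requested obs key from the first demo.
    with h5py.File(hdf5_path, "r") as f:
        demo = f["data"][sorted(f["data"].keys())[0]]
        shapes = {}
        for k in obs_keys:
            # Raises KeyError if 'obs/<k>' is not stored in the file,
            # e.g. 'robot0_eef_quat' in the human play data.
            shapes[k] = demo["obs/{}".format(k)].shape[1:]
    return shapes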

The human play data doesn't contain the 'robot0_eef_quat' key, so it is not suitable for training a low-level controller. Also, it's convenient to use an HDF5 viewer to check HDF5 datasets.
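
For example, a quick way to list which observation keys a demo file actually contains with h5py (the dataset path below is a placeholder; it assumes the usual data/demo_*/obs layout):

import h5py

with h5py.File("datasets/playdata/demo.hdf5", "r") as f:
    demo_name = sorted(f["data"].keys())[0]
    # Print the obs keys stored for the first demo.
    print(list(f["data"][demo_name]["obs"].keys()))
    # If 'robot0_eef_quat' is not listed here, the low-level config
    # cannot request it as a low_dim observation key.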