NVlabs/Deep_Object_Pose

Pose Annotation not found while trying to generate synthetic data using Blenderproc


I am trying to generate a synthetic dataset using Blenderproc, but the generated JSON file for each image doesn't contain any information about location or quaternion_xyzw.

The generated json file looks like this:

{ "camera_data": { "width": 640, "height": 480, "camera_look_at": { "at": [ -0.0, 1.0, -7.549790126404332e-08 ], "eye": [ -0.0, 25.0, -0.0 ], "up": [ 1.0, 0.0, 0.0 ] }, "intrinsics": { "fx": 772.5484890407986, "fy": 772.5484890407986, "cx": 0.0, "cy": 0.0 } }, "objects": [ { "class": "Ketchup", "name": "Ketchup_000", "visibility": 1946, "projected_cuboid": [ [ 224.19110758149313, 116.15143978821277 ], [ 212.46794541428113, 72.44711783175563 ], [ 251.225698693963, 120.73907352209227 ], [ 262.2133609618124, 159.82758583644204 ], [ 192.26477131843527, 128.60732524413984 ], [ 181.3122982956025, 84.92528896550124 ], [ 223.70554564229843, 131.38721399917722 ], [ 234.0947073360387, 170.44839703066373 ], [ 223.9288088155419, 124.3818062835901 ] ] } }

@nv-jeff, could you look into this?

The JSON fields location and quaternion_xyzw are not required for training.

Could they be added for those who want to use them for testing?

pretty please!

Thanks for the reply. It would be useful if the output had this information. In the meantime, is it possible to recover the pose labels (quaternion_xyzw and location) with PnP, since we have the projected cuboid and the 3D keypoints? And how accurate would that be?
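For reference, a PnP recovery along those lines might look like the sketch below. It is only an illustration, not code from this repo: it assumes the object's cuboid half-extents are known, that the 3D corner ordering matches the order of projected_cuboid in the JSON (verify this against your data), and the file name and half-extent values are hypothetical. If the projected corners come straight from the ground-truth pose, PnP should recover that pose up to numerical error; noisy or wrongly ordered points will degrade the result quickly.

# A minimal sketch (not the repo's own code): estimate location/quaternion from the
# projected cuboid with PnP. The corner ordering and dimensions below are assumptions.
import json
import numpy as np
import cv2
from scipy.spatial.transform import Rotation

with open("000000.json") as f:            # hypothetical annotation file name
    ann = json.load(f)

intr = ann["camera_data"]["intrinsics"]
K = np.array([[intr["fx"], 0.0, intr["cx"]],
              [0.0, intr["fy"], intr["cy"]],
              [0.0, 0.0, 1.0]])

# Object cuboid half-extents in object units (hypothetical values -- use your model's).
hx, hy, hz = 0.05, 0.12, 0.03
corners = np.array([[sx * hx, sy * hy, sz * hz]
                    for sx in (1, -1) for sy in (1, -1) for sz in (1, -1)],
                   dtype=np.float64)
pts_3d = np.vstack([corners, [0.0, 0.0, 0.0]])    # 8 corners + centroid

pts_2d = np.array(ann["objects"][0]["projected_cuboid"], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(pts_3d, pts_2d, K, None, flags=cv2.SOLVEPNP_ITERATIVE)
quat_xyzw = Rotation.from_rotvec(rvec.ravel()).as_quat()   # x, y, z, w order
location = tvec.ravel()                                    # translation estimate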

Hah, yes, it could be added. We originally omitted it in order to keep the Blenderproc code as focused as possible on generating training data, but the information is in there if we need it. I will work on this and check it in when it's done and tested.

Thank you. Looking forward to it.

I have pushed a change which adds these two fields (location and quaternion_xyzw) to the JSON output. I also added a new script, validate_data.py, which draws the cuboids using the information in a JSON file.
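For anyone doing a quick sanity check by hand, a rough sketch of the same idea follows (this is not the actual validate_data.py, and the file names are hypothetical): it simply reads the annotation JSON and marks each projected cuboid point on the image.

# Rough sketch: draw projected_cuboid points from the JSON onto the image.
import json
import cv2

with open("000000.json") as f:            # hypothetical file names
    ann = json.load(f)
img = cv2.imread("000000.png")

for obj in ann["objects"]:
    for x, y in obj["projected_cuboid"]:
        cv2.circle(img, (int(round(x)), int(round(y))), 3, (0, 255, 0), -1)

cv2.imwrite("annotated.png", img)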

Note: The order of projected_points has changed in this changelist! I found a transformation error in my code, which rotated the cuboid 180 degrees around the Z axis (the object's vertical axis). Older data is still valid for training, although the network will treat the back of an object as its front and vice versa; it should not be intermingled with data generated after this change.
The salient change was to the dope_order list on line 218 of generate_training_data.py, if you wish to investigate. Note, too, that this change only affects the Blenderproc pipeline; the NVISII data generation scripts have not been changed.

Hey @nv-jeff

I have a question about dataset generation using Blenderproc. The function in generate_training_data.py at line 182 calculates the 3D bounding box of the mesh. In my case I have only one mesh object, but every time the function is called it prints different values for the bbox. Ideally the bbox of the mesh should remain the same, right?

That function returns the object-aligned bounding box coordinates in world coordinates (not in object coordinates), so it will be different if the pose of the object is different.
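To illustrate the point with a plain-numpy sketch (not the repo's code): the same local-frame bounding box, pushed through two different object poses, yields two different sets of world-space corners even though the mesh itself never changes.

# Minimal numpy sketch of why the returned corners change with the object's pose.
import numpy as np

def bbox_world(corners_local, pose_4x4):
    """Transform Nx3 local-frame corners into world coordinates with a 4x4 pose."""
    homog = np.hstack([corners_local, np.ones((len(corners_local), 1))])
    return (homog @ pose_4x4.T)[:, :3]

corners_local = np.array([[sx, sy, sz] for sx in (-1, 1)
                          for sy in (-1, 1) for sz in (-1, 1)], dtype=float)

pose_a = np.eye(4)                                     # identity pose
pose_b = np.eye(4); pose_b[:3, 3] = [0.0, 2.0, 0.0]    # same object, translated in Y

print(bbox_world(corners_local, pose_a))   # differs from...
print(bbox_world(corners_local, pose_b))   # ...this, although the mesh is unchanged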