kxhit/vMAP

Testing vMAP on an Intel RealSense RGB-D camera

random-guest opened this issue · 2 comments

Let me start by congratulating you for the great work and thanking you for the well-organized and easy-to-follow repo.

I am wondering about the steps to test vMAP on live-stream data from an Intel RealSense camera or a Microsoft Kinect camera.

Any suggestions or recommendations will be highly appreciated.

Regards,

kxhit commented

@random-guest Hi, thanks for your interest in our work!

Here are the steps for setting up a live demo.

  1. Get ORB-SLAM or any odometry system running, and send the latest frame (with its estimated pose) and the up-to-date poses (after BA) to vMAP via a ROS node. [RGB, Depth, Pose]
  2. Get instance segmentation masks (run Detic or any off-the-shelf network) and associate them with tracked IDs. [Mask]
  3. Now we have enough info [RGB, Depth, Pose, Mask] to run the vMAP system: vMAP will automatically init an MLP model for each instance ID and keep updating its reconstruction. Please note the reconstruction quality will rely heavily on the tracking performance, as we assume a fairly stable front-end. Multi-view supervision already alleviates this somewhat, and making the system more robust to tracking drift could be interesting future work.
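The per-instance routing in step 3 can be sketched roughly as follows. This is only an illustrative Python sketch, not the actual vMAP API: the class and method names (`Frame`, `InstanceModel`, `Mapper`, `update`) are placeholders I made up, and the `InstanceModel` stands in for the per-object MLP that vMAP would train.

```python
from dataclasses import dataclass

# Hypothetical frame packet matching the [RGB, Depth, Pose, Mask] info above.
# Field types are left loose; in practice these would be image arrays and a 4x4 pose.
@dataclass
class Frame:
    rgb: object     # HxWx3 color image
    depth: object   # HxW depth map
    pose: object    # 4x4 camera-to-world transform (after BA, from the front-end)
    masks: dict     # instance_id -> binary mask from the segmentation network


class InstanceModel:
    """Stand-in for one per-object MLP; vMAP would hold network weights here."""

    def __init__(self, instance_id):
        self.instance_id = instance_id
        self.num_updates = 0

    def update(self, rgb, depth, pose, mask):
        # In the real system this would run a training step on the object's MLP,
        # supervised by the masked RGB-D observation at the given pose.
        self.num_updates += 1


class Mapper:
    """Lazily inits one model per instance ID and updates it on every frame."""

    def __init__(self):
        self.models = {}  # instance_id -> InstanceModel

    def process(self, frame):
        for inst_id, inst_mask in frame.masks.items():
            if inst_id not in self.models:
                # First time we see this tracked ID: init a fresh model for it.
                self.models[inst_id] = InstanceModel(inst_id)
            self.models[inst_id].update(frame.rgb, frame.depth, frame.pose, inst_mask)


mapper = Mapper()
# Two frames: instance 1 is visible in both, instance 2 only in the first.
mapper.process(Frame(rgb=None, depth=None, pose=None, masks={1: "m1", 2: "m2"}))
mapper.process(Frame(rgb=None, depth=None, pose=None, masks={1: "m1"}))
print(sorted(mapper.models))        # both IDs now have a model
print(mapper.models[1].num_updates) # instance 1 was updated twice
```

The key design point this mirrors is that no global map model exists: each tracked instance ID owns its own small model, created on first sight and refined whenever the ID reappears in a new frame.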

Thank you very much for the fast and detailed reply!!

Yes, that would be interesting to look at!