dji-sdk/Guidance-SDK-ROS

Modifying API to better use existing ROS tools

Closed this issue · 2 comments

Thanks for adding the camera calibration parameter access in 1.3.7.
Now that we have access to these params, one could quickly use http://wiki.ros.org/stereo_image_proc to get point clouds, etc.

What would be useful is to have camera_info topics published and the existing topics renamed to

  • /guidance/right/image_raw, instead of /guidance/right_image
  • /guidance/right/camera_info, populated from the new stereo_cali struct
  • etc., so they can be reused with existing tools without changes

Another much-appreciated feature would be calibration via ROS. It would be nice if one could use http://wiki.ros.org/camera_calibration directly, or if the Windows tool could export all the params described in http://docs.ros.org/api/sensor_msgs/html/msg/CameraInfo.html and http://wiki.ros.org/image_pipeline/CameraInfo (the latter can be done with camera_info alone).
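For reference, here is a minimal sketch (not DJI's code; the struct and the field names cu, cv, focal, and baseline mirror what the Windows utility reports, and the real stereo_cali layout may differ) of how those four parameters would map onto CameraInfo's K and P matrices:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Hypothetical mirror of the four values the Guidance tools report per
// camera: principal point (cu, cv), focal length in pixels, and
// baseline in meters. The real stereo_cali struct may differ.
struct StereoCali {
    double cu, cv, focal, baseline;
};

// Fill CameraInfo-style matrices for a rectified pinhole camera:
// K is 3x3 row-major, P is 3x4 row-major. For the right camera of a
// stereo pair, P's Tx entry is -focal * baseline (in pixel units).
void fillKandP(const StereoCali& c, bool right_camera,
               std::array<double, 9>& K, std::array<double, 12>& P) {
    K = { c.focal, 0.0,     c.cu,
          0.0,     c.focal, c.cv,
          0.0,     0.0,     1.0 };
    const double tx = right_camera ? -c.focal * c.baseline : 0.0;
    P = { c.focal, 0.0,     c.cu, tx,
          0.0,     c.focal, c.cv, 0.0,
          0.0,     0.0,     1.0,  0.0 };
}
```

The left camera of the pair would get Tx = 0; only the right camera carries the baseline term.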

Also, it would be great if the commits could be broken down by feature, with feature-driven commit messages, instead of something like "updated to 1.3.7".

Finally, some questions:

I have these params (obtained from the Windows utility; I think I messed up the second and third sensors a bit):

         cu       cv    focal  baseline
    161.027  126.507  230.969  0.150264
    149.911  115.413  252.470  0.149167
    152.967  121.818  236.998  0.150177
    163.841  126.252  233.432  0.149859
    165.338  123.442  231.710  0.149611
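As a sanity check on those numbers, here is the standard rectified-stereo relation between disparity and depth (my own sketch, not SDK code); with focal ≈ 231 px and baseline ≈ 0.15 m, a 10 px disparity corresponds to roughly 3.5 m:

```cpp
#include <cassert>
#include <cmath>

// Depth (meters) from disparity (pixels) for a rectified stereo pair:
//     depth = focal * baseline / disparity
double depthFromDisparity(double focal_px, double baseline_m,
                          double disparity_px) {
    return focal_px * baseline_m / disparity_px;
}
```

If the depth image looks wrong even though these numbers are plausible, the fault is more likely in rectification or in which calibration row is applied to which sensor than in the relation itself.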

But my depth image is fairly screwed up (obtained from the USB example, running on Ubuntu).

Where am I going wrong?
[image: dji_guidance_issue]

Update: I am publishing the camera info manually and getting segfaults. :\
See comments in issue #1

I renamed topics to /guidance/right/image_raw, etc.
If I try to calibrate via ROS (http://wiki.ros.org/camera_calibration/Tutorials/StereoCalibration) using

    rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 \
        right:=/guidance/right/image_raw left:=/guidance/left/image_raw \
        right_camera:=/guidance/right left_camera:=/guidance/left \
        --no-service-check

nothing happens.
(--no-service-check because I just want to save the calibration results; otherwise one would use camera_info_manager)

But if I run a monocular calibration using either the right or the left camera, it works well.

Thank you very much for the advice. We'll update the package as you suggested. Actually, it would be very nice if you could modify the code directly and create a pull request.

OK, I'll bring in something by the end of this week.
Also, can DJI provide the calibration params as well? It would be helpful to have them.
I am unable to track down the segfault. It's actually happening because of the camera_info publishing I am doing manually.

(I think) the fault is happening because of the float64[] data type in sensor_msgs::CameraInfo (look at float64[] D, the distortion params, here: http://docs.ros.org/jade/api/sensor_msgs/html/msg/CameraInfo.html).

So it's actually a vector (which I realized because it doesn't compile unless you enforce C++11 in the CMakeLists.txt).
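That would explain the crash. A minimal ROS-free sketch of the failure mode (the coefficient values are made-up placeholders, not real calibration data): writing through operator[] on an empty std::vector&lt;double&gt; is undefined behavior, so a float64[] field like D has to be sized before it is indexed.

```cpp
#include <cassert>
#include <vector>

// In a ROS message, a float64[] field such as CameraInfo's D is a
// std::vector<double>, which starts out empty.
std::vector<double> makeDistortion() {
    std::vector<double> D;

    // D[0] = -0.0426;   // WRONG: operator[] on an empty vector is
    //                   // undefined behavior -- typically a segfault.

    D.resize(5);         // Size it first (plumb_bob uses 5 coefficients),
    D[0] = -0.0426;      // then indexing is safe. These values are
    D[1] = 0.0103;       // made-up placeholders.

    // Equivalent C++11 one-liner:
    // D = { -0.0426, 0.0103, 0.0, 0.0, 0.0 };
    return D;
}
```

This is the same resize-before-index pattern the LaserScan code in GuidanceNode.cpp already uses for g_oa.ranges.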

Now, if we look at https://github.com/dji-sdk/Guidance-SDK-ROS/blob/master/src/GuidanceNode.cpp#L142-L150 :

    sensor_msgs::LaserScan g_oa;
    g_oa.ranges.resize(5);
    g_oa.header.frame_id = "guidance";
    g_oa.header.stamp    = ros::Time::now();
    g_oa.ranges[0] = 0.01f * oa->distance[0];
    g_oa.ranges[1] = 0.01f * oa->distance[1];
    g_oa.ranges[2] = 0.01f * oa->distance[2];
    g_oa.ranges[3] = 0.01f * oa->distance[3];
    g_oa.ranges[4] = 0.01f * oa->distance[4];

which is the pattern I finally resorted to, after a couple of days of trying to find the fault.

But no, it still occurs (as mentioned in another issue).

I resorted to publishing the float64[] D part of the camera_info message this way (no loops, no resizing the vectors). I tried a bunch of other approaches as well, including parsing a YAML, but it just doesn't work.

Is this some 32- vs. 64-bit thing (as you said in the other issue), or am I doing something wrong at a basic level?

Ideally, we should use camera_info_manager, but I am doing basically the same thing: it uses the camera_calibration_parsers package to parse the YAML for the camera parameters. So I am not doing that yet and am using the C++ SDK for now.

Will calibrating with a physical checkerboard give better results than the Windows utility?