stereolabs/zed-sdk

Questionable ZED accuracy?

WASCHMASCHINE opened this issue · 17 comments

I've noticed some depth issues with my ZEDs, which had abysmal depth errors with the factory calibration, i.e. sensing 12 m when the real depth was 25 m.

To test the depth, we used a planar, highly textured target and moved it away from the camera in 50 steps (1 m to 25 m), saving the measurements as SVOs (720p mode). On the ZED we mounted a small laser rangefinder pointing in the same direction. Then I went through every SVO file and picked the same pixel, whose ray hit the target in every frame (close to the red laser dot). Each measurement is the mean of all disparities over the SVO sequence, which should improve accuracy considerably. I then used a manual calibration file to improve the results, which are plotted in the image below. I also ran ELAS on the rectified images with the rectified intrinsics to test whether your algorithm is simply wrong. It does not seem that way:

test_zed

Using ULTRA mode, no self calib.

Is there any way to improve the depth accuracy? I mean, the depth error model e_z = z^2/(B*f)*e_m (with z the distance, B the baseline, f the horizontal focal length, and e_m the disparity (matching) error of your algorithm) yields 25.0m^2/(0.12m*670px)*0.5px = 3.89m, which is on point for ELAS. But I would expect the ZED to fluctuate around the laser ground truth rather than be systematically off, i.e. biased.
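That back-of-the-envelope model is easy to check numerically. A minimal sketch, using the B, f, and e_m values quoted above:

```python
def depth_error(z, baseline=0.12, focal_px=670.0, e_m=0.5):
    """First-order stereo depth error: e_z = z^2 / (B * f) * e_m.

    z in metres, baseline in metres, focal length and matching error in pixels.
    """
    return z ** 2 / (baseline * focal_px) * e_m

print(f"{depth_error(25.0):.2f} m")  # prints 3.89 m, the figure quoted above
```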

What is the absolute best way to calibrate the camera?

Also, why was the factory calibration so bad?

Your charts show the expected quadratic growth of the Z-error with range, together with an increasing bias, which is consistent with the stereo uncertainty model.
The systematic bias you mention comes from subpixel interpolation, which is affected by pixel-locking and foreshortening; both phenomena are discussed in many papers. We are working on bias reduction techniques, but that is still a work in progress.

Regarding calibration, your camera may have suffered a mechanical shock or there was an issue during its calibration process. Either way, please send to our Support team your order and serial numbers so we have a look into the issue.

For the recalibration, your charts indicate that you are already close to the stereo accuracy model. You can still try Kalibr, which is currently the state-of-the-art calibration tool and could improve your triangulation results.

Hey! I got similar behaviour. What I did to solve this was to calibrate each lens separately with an external ROS calibrator and then feed both calibrations into OpenCV's stereo calibration. It works much better than the calibration tool.

@wbapablo That sounds great! Do you have a link to the external ROS calibrator? I assumed that the SDK just used the stereo calibration of OpenCV, though.

Do you have some data to show the improvement, too?

This paper shows an approximation of the RMS error of the ZED camera as a function of distance.
https://ddd.uab.cat/pub/elcvia/elcvia_a2018v17n1/elcvia_a2018v17n1p1.pdf

@LuisOrtizF The link doesn't work for me, could you share the title and author name please?

I am curious what the best depth accuracy you can achieve using ZED camera at 1m - 2m distance range. 1cm?

This depends on the calibration (fx and B) and the quality of the stereo matching. Assume B = 0.12 m, fx = 1400 px (2K mode), and a Gaussian disparity error of e_d = 0.5 px (1 sigma). Plugging d = B*f/z into
e_z = B*f/d - (B*f)/(d + e_d), we get

e_z = 3mm @ z = 1m
e_z = 12mm @ z = 2m

With worse stereo matching, e_d = 1.0 px
e_z = 6mm @ z = 1m
e_z = 23.5mm @ z = 2m

Keep in mind that e_z is one approximate standard deviation of a Gaussian and that it is only adequate for near points. So @1m with good stereo matching and perfect calibration, most measurements will be below 1 cm. But keep in mind that the stereo matching may be biased in some way.
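The arithmetic above can be reproduced in a few lines. A minimal sketch using the same B, fx, and e_d values as in this comment:

```python
def depth_error_exact(z, baseline=0.12, focal_px=1400.0, e_d=0.5):
    """Depth error at range z (m) for a disparity error of e_d pixels,
    via e_z = B*f/d - B*f/(d + e_d) with the ideal disparity d = B*f/z."""
    bf = baseline * focal_px   # 168 px*m for the 2K-mode numbers above
    d = bf / z                 # ideal disparity at range z, in pixels
    return bf / d - bf / (d + e_d)

for e_d in (0.5, 1.0):
    for z in (1.0, 2.0):
        e_z_mm = depth_error_exact(z, e_d=e_d) * 1000
        print(f"e_d = {e_d} px, z = {z} m -> e_z = {e_z_mm:.1f} mm")
```

Running it reproduces the ~3 mm / 12 mm and 6 mm / 23.5 mm figures quoted above.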

This is very helpful. Thanks a lot!

@WASCHMASCHINE Did ROS calibrator work for you? Did you eliminate the estimation error with manual calibration ?

@rpfly3 I did not try it out yet, so feel free to share your results! I have, however, adapted OpenCV's stereo calibration code to work with the ZED in no-GPU mode, but I haven't figured out the mapping of the parameters to Stereolabs' calibration parameters. I did not spend much time on it, though.

@WASCHMASCHINE you might find this useful :
https://github.com/stereolabs/zed-opencv-native/blob/f7e17b6368ecbc432c0ca0d64be4586e88db8799/src/calibration.cpp#L74-L175

This function takes the ZED calibration parameters and converts them to OpenCV format (the reverse of what you want).
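Going the other way (pushing fresh OpenCV calibration results back into a ZED-style settings file) is mostly bookkeeping. A hypothetical sketch, assuming the usual SN*.conf layout with [LEFT_CAM_HD]/[RIGHT_CAM_HD]/[STEREO] sections; the key names here follow that convention but should be checked against your own conf file, and `write_zed_conf` is an illustrative helper, not an SDK function:

```python
import configparser

def write_zed_conf(path, left, right, baseline_mm, rot):
    """Write OpenCV-style intrinsics into a ZED-style settings file.

    left/right: dicts with fx, fy, cx, cy, k1, k2 (e.g. from calibrateCamera).
    baseline_mm: stereo baseline; ZED conf files store it in millimetres.
    rot: three rotation components between the sensors (assumed RX/CV/RZ order;
         verify against your camera's own SN*.conf before trusting this).
    """
    conf = configparser.ConfigParser()
    conf.optionxform = str  # keep key capitalisation as written
    for section, cam in (("LEFT_CAM_HD", left), ("RIGHT_CAM_HD", right)):
        conf[section] = {key: repr(val) for key, val in cam.items()}
    conf["STEREO"] = {
        "Baseline": repr(baseline_mm),
        "RX_HD": repr(rot[0]),
        "CV_HD": repr(rot[1]),
        "RZ_HD": repr(rot[2]),
    }
    with open(path, "w") as f:
        conf.write(f)
```

Whether the SDK accepts such a hand-written file in place of its factory calibration is something to confirm with Stereolabs; the sketch only shows the format conversion itself.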

@adujardin Thank you! Can't you push the reverse too? 😅

@WASCHMASCHINE did you figure out a better way to calibrate the camera?

Keep in mind that e_z is one approximate standard deviation of a Gaussian and that it is only adequate for near points. So @1m with good stereo matching and perfect calibration, most measurements will be below 1 cm. But keep in mind that the stereo matching may be biased in some way.

You mention good stereo matching - is this something we can control, or is this already implemented when we get the depth image from /depth/depth_registered?

@WASCHMASCHINE did you figure out a better way to calibrate the camera?

No, I haven't spent any more time with the ZED.

You mention good stereo matching - is this something we can control, or is this already implemented when we get the depth image from /depth/depth_registered?

This depends on the stereo matching algorithm, which is closed source, so you are limited to their implementation for creating the disparity map. That is unfortunate, because there are newer algorithms that could outperform the current ZED SDK in certain scenes, though not all.

If we use a structured light projector in addition to the ZED system, by how much can we reduce the disparity error?