Robot URDF calibration using touch probes and reference geometry. Is it possible?
Opened this issue · 1 comment
Hi, I'm interested in calibrating a robot URDF (relative frame translations and rotations) plus joint offsets, and I would like to ask whether this package could work with calibration protocols that use a 6-DoF robot with:
- a touch probe and a datum cube [example video], or
- one or more dial indicators and reference spheres [example video 1], [example video 2]
If you know of any use cases or steps to set this up, please share.
I understand that the optimization task in these videos is to fine-tune the relative URDF joint transformations and offsets so that, in the datum-cube case, the forward kinematics produces 5 planes that form an accurate cube, with no outliers.
Or, in the second video, so that the set of sphere-center measurements for each sphere converges to a single center point in the world frame (rather than a scatter of points when uncalibrated).
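To make that second objective concrete, here is a minimal sketch (a toy 2-link planar arm with invented link lengths and offsets, using scipy — not this or any package's actual API) that recovers an elbow joint offset by driving the reprojected sphere-center measurements to a single point:

```python
# Hypothetical sketch: recover a joint offset by forcing repeated
# sphere-center measurements to converge to one world point.
# Toy 2-link planar arm; all numbers are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.5, 0.4                  # assumed link lengths
CENTER = np.array([0.6, 0.3])      # true sphere center in the world frame
TRUE_ELBOW_OFFSET = -0.02          # the miscalibration we want to recover

def fk(q, elbow_offset):
    """Probe-tip position of the toy arm with an elbow joint offset applied."""
    a = q[0]
    b = a + q[1] + elbow_offset
    return np.array([L1 * np.cos(a) + L2 * np.cos(b),
                     L1 * np.sin(a) + L2 * np.sin(b)])

def ik_branches(p, elbow_offset):
    """Analytic planar 2-link IK (elbow-up and elbow-down), undoing the offset."""
    x, y = p
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    sols = []
    for sign in (1.0, -1.0):
        q2 = sign * np.arccos(c2)
        q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
        sols.append(np.array([q1, q2 - elbow_offset]))
    return sols

# Simulated measurement set: joint configurations whose (miscalibrated)
# tip actually touches the same sphere center.
qs = ik_branches(CENTER, TRUE_ELBOW_OFFSET)

def residuals(params):
    # With a wrong offset the reprojected centers scatter; penalize the
    # deviation of each reprojected point from their mean.
    pts = np.array([fk(q, params[0]) for q in qs])
    return (pts - pts.mean(axis=0)).ravel()

fit = least_squares(residuals, x0=[0.0])
print(fit.x)  # close to TRUE_ELBOW_OFFSET
```

The planar toy only has two IK branches per target, whereas a real 6-DoF arm would contribute many more poses per sphere. Note also that in this toy a base-joint offset is unobservable from the spread alone, since it rotates every reprojected point identically about the base; constraining it needs known reference geometry, as in the datum-cube setup.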
If there are other tools in the ROS ecosystem that serve this purpose better, please don't hesitate to suggest them.
Thank you for your kind support and consideration.
Jakub
I've not really seen this sort of calibration anywhere in the ROS ecosystem.
In theory, robot_calibration could be expanded to do this sort of task (at least the touch probe and cube version), but it doesn't exist out of the box. We do have error cost functions for things like aligning points to a plane, so you could model the cube as a series of planes, and the reprojection of the arm and touch probe would give you a series of points in 3D space. Actually moving the arm to a desired pose and stopping based on the probe would be something new, though (it's also entirely possible that there would be some unexpected oddities to resolve when constructing so many point-to-plane error blocks with a single point for each).
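For concreteness, here is a rough sketch of what such point-to-plane error blocks could look like — a 2-D toy where lines stand in for cube faces, with invented offsets and scipy in place of the package's actual solver. Each probed touch contributes one scalar residual: the signed distance from the reprojected tip to the face it touched.

```python
# Hypothetical sketch of point-to-plane error blocks (2-D toy: lines
# stand in for cube faces; numbers are invented, not robot_calibration's API).
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.5, 0.4                         # assumed link lengths
TRUE_OFFSETS = np.array([0.01, -0.03])    # invented base/elbow miscalibration

def fk(q, off):
    """Probe-tip position of a toy 2-link planar arm with joint offsets."""
    a = q[0] + off[0]
    b = a + q[1] + off[1]
    return np.array([L1 * np.cos(a) + L2 * np.cos(b),
                     L1 * np.sin(a) + L2 * np.sin(b)])

def ik(p, off):
    """Analytic planar 2-link IK (elbow-down branch), undoing the offsets."""
    x, y = p
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    q2 = -np.arccos(c2)
    q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
    return np.array([q1 - off[0], q2 - off[1]])

# Two faces of a "datum square": (unit normal n, offset d), with n·p = d on the face.
faces = [(np.array([1.0, 0.0]), 0.6),   # vertical face x = 0.6
         (np.array([0.0, 1.0]), 0.3)]   # horizontal face y = 0.3

# Simulated touches: joint configurations whose true tip lies on a known face.
targets = [(np.array([0.6, y]), 0) for y in np.linspace(0.0, 0.5, 5)] \
        + [(np.array([x, 0.3]), 1) for x in np.linspace(0.3, 0.7, 5)]
touches = [(ik(p, TRUE_OFFSETS), i) for p, i in targets]

def residuals(off):
    # One point-to-plane error block per touch, each with a single point.
    return [faces[i][0] @ fk(q, off) - faces[i][1] for q, i in touches]

fit = least_squares(residuals, x0=np.zeros(2))
print(fit.x)  # close to TRUE_OFFSETS
```

With two non-parallel faces, both the base and elbow offsets become observable in this toy, which is one reason the known cube geometry is attractive. What the sketch does not cover is the hard part called out above: commanding the arm and detecting the probe trigger to collect the touches in the first place.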