This paper presents a novel soft tactile finger with omni-directional adaptation, using multi-channel optical fibers for rigid-soft interactive grasping. Machine learning methods are used to train a model for real-time prediction of force, torque, and contact from the collected tactile data. We further integrate these fingers into a reconfigurable three-finger gripper whose finger arrangement can be actively adjusted in real time based on the tactile data collected during grasping, realizing rigid-soft interactive grasping. Detailed sensor calibration and experimental results are also included to validate the proposed design for enhanced grasping robustness.
$ git clone https://github.com/yanglh14/InteractiveGrasping.git
$ python3 -m pip install --user virtualenv
$ python3 -m virtualenv my_env
$ source my_env/bin/activate
_data_processing_ : process the collected data and train a prediction model with machine learning methods.
_visulization_ : visualize the data processing and experiment results.
_DataCollection_ : collect data for sensor calibration.
_InteractiveGrasping_ : implement rigid-soft interactive grasping.
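As a rough illustration of the calibration-and-prediction idea behind _data_processing_, the sketch below fits a model that maps multi-channel optical-fiber intensities to a force value. The channel count, data format, and model (ordinary least squares on synthetic data) are illustrative assumptions, not the repository's actual pipeline:

```python
# Hedged sketch: map multi-channel optical-fiber intensities to contact force.
# All names and numbers here (n_channels, the synthetic data, the linear model)
# are assumptions for illustration, not the repo's real data or model.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 8    # hypothetical number of optical-fiber channels
n_samples = 500   # hypothetical number of calibration samples

# Synthetic calibration data: intensity readings X and a force label y that is
# a noisy linear function of the channels (stand-in for real sensor logs).
X = rng.uniform(0.0, 1.0, size=(n_samples, n_channels))
true_w = rng.normal(size=n_channels)
y = X @ true_w + 0.01 * rng.normal(size=n_samples)

# Calibration step: fit per-channel weights by ordinary least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Real-time prediction step: apply the fitted weights to a new reading.
reading = rng.uniform(0.0, 1.0, size=n_channels)
predicted_force = float(reading @ w)
```

In practice a nonlinear model would likely replace the least-squares fit, but the two-stage structure (offline calibration, then cheap online prediction) is the same.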
Should you have any questions regarding the implementation of our soft tactile sensor, please contact Yang Linhan at yanglh14@163.com .