robot-pesg/BotanicGarden

Hi, how can I solve the problem of "IMU excitation not enouth! Not enough features or parallax; Move device around"?

Closed this issue · 4 comments

When I run that dataset with VINS-Mono, the terminal keeps printing the error message "IMU excitation not enouth! Not enough features or parallax; Move device around". Can you help me and give me some advice?
Here are my parameter settings from the yaml file.

%YAML:1.0

#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam;

imu: 1

num_of_cam: 1

imu_topic: "/imu/data"
image_topic: "/dalsa_gray/left/image_raw"
output_path: "/home/zhang/output/"

distortion_parameters:
   k1: -0.061606731291039
   k2: 0.100129900708930
   p1: 0.0
   p2: 0.0
projection_parameters:
   fx: 643.5951111267952
   fy: 642.6656126857330
   cx: 475.2215406900663
   cy: 307.4120184196884
image_width: 960
image_height: 600

#Extrinsic parameter between IMU and Camera.

estimate_extrinsic: 0 # 0 Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
# 1 Have an initial guess about extrinsic parameters. We will optimize around your initial guess.

extrinsicRotation: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [-0.00375313, 0.01609339, 0.99986345,
          -0.99997267, -0.00642959, -0.00365005,
           0.00636997, -0.99984982, 0.01611708]
#Translation from camera frame to imu frame, imu^T_cam
extrinsicTranslation: !!opencv-matrix
   rows: 3
   cols: 1
   dt: d
   data: [0.17517298, 0.13259005, 0.0670137]

#Multiple thread support
multiple_thread: 4 # origin 1

#feature tracker parameters
max_cnt: 150 # max feature number in feature tracking
min_dist: 30 # min distance between two features
freq: 0 # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0 # ransac threshold (pixel)
show_track: 1 # publish tracking image as topic
flow_back: 1 # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04 # max solver iteration time (ms), to guarantee real time; origin 0.04
max_num_iterations: 8 # max solver iterations, to guarantee real time; origin 8
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters The more accurate parameters you provide, the better performance
acc_n: 0.001 # accelerometer measurement noise standard deviation.
gyr_n: 0.0002 # gyroscope measurement noise standard deviation.
acc_w: 0.00002 # accelerometer bias random walk noise standard deviation.
gyr_w: 0.000005 # gyroscope bias random walk noise standard deviation.
g_norm: 9.81007 # gravity magnitude

#unsynchronization parameters
estimate_td: 0 # online estimate time offset between camera and imu
td: 0.0 # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#loop closure parameters
loop_closure: 1
load_previous_pose_graph: 0 # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/zhang/output/pose_gragh/" # save and load path
save_image: 1 # save images in the pose graph for visualization purposes; you can disable this function by setting 0
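For reference, the distortion_parameters and projection_parameters above describe a pinhole camera with radial-tangential (plumb-bob) distortion. A minimal sketch of how these numbers map a point into pixels, under the assumption that this is the model VINS applies (the coefficients here are copied from the yaml above):

```python
# Sketch (assumption: plumb-bob/radial-tangential distortion model):
# project a normalized camera-frame point to pixel coordinates using
# the intrinsics listed in the yaml above.

K1, K2, P1, P2 = -0.061606731291039, 0.100129900708930, 0.0, 0.0
FX, FY = 643.5951111267952, 642.6656126857330
CX, CY = 475.2215406900663, 307.4120184196884

def project(xn, yn):
    """Apply radial-tangential distortion to a normalized point (x/z, y/z),
    then map it through the pinhole intrinsics to pixel coordinates."""
    r2 = xn * xn + yn * yn
    radial = 1.0 + K1 * r2 + K2 * r2 * r2
    xd = xn * radial + 2.0 * P1 * xn * yn + P2 * (r2 + 2.0 * xn * xn)
    yd = yn * radial + P1 * (r2 + 2.0 * yn * yn) + 2.0 * P2 * xn * yn
    return FX * xd + CX, FY * yd + CY

# The optical axis (0, 0) must land exactly on the principal point (cx, cy).
print(project(0.0, 0.0))
```

Since k1 is negative (barrel distortion), off-axis points land slightly closer to the image center than an undistorted pinhole would put them.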

@NEFU-zhang Dear, this is just because the robot has not moved yet. Please wait until the robot moves, or use rosbag play xx.bag -s 20 to skip the first 20 seconds of the rosbag. Thanks for your report.
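The "IMU excitation not enouth!" half of the message comes from the initializer rejecting a stationary IMU. A rough sketch of that check, under the assumption that it works like the released VINS-Mono initialStructure() code (standard deviation of the per-frame mean acceleration compared against a threshold of about 0.25):

```python
# Sketch (assumption: modeled after VINS-Mono's initialStructure(), which
# computes the standard deviation of the per-frame mean acceleration from
# IMU pre-integration and rejects initialization when it is too small).
import math

EXCITATION_THRESHOLD = 0.25  # threshold used in the open-source code (assumption)

def imu_excitation_ok(frame_mean_accels):
    """frame_mean_accels: one (ax, ay, az) mean acceleration per image frame
    (in VINS these come from pre-integration, roughly delta_v / dt).
    Returns True when the acceleration varies enough to initialize."""
    n = len(frame_mean_accels)
    mean = [sum(a[i] for a in frame_mean_accels) / n for i in range(3)]
    var = sum(sum((a[i] - mean[i]) ** 2 for i in range(3))
              for a in frame_mean_accels) / n
    return math.sqrt(var) >= EXCITATION_THRESHOLD

# Stationary robot: gravity only, nothing varies -> not enough excitation.
print(imu_excitation_ok([(0.0, 0.0, 9.81)] * 10))   # False

# Accelerating robot: the mean acceleration changes between frames.
print(imu_excitation_ok([(0.0, 0.0, 9.81), (1.5, 0.0, 9.81),
                         (0.0, 1.5, 9.81), (-1.5, 0.0, 9.81)]))   # True
```

This is why skipping the stationary beginning of the bag (rosbag play -s 20) makes the message go away once the robot accelerates.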

Thanks for your reply, but my problem doesn't seem to be that the robot has not moved yet.
When the bag file first starts playing, the terminal keeps printing the error message "IMU excitation not enouth! Not enough features or parallax; Move device around", since the robot is stationary.
But when the robot starts to move, the terminal keeps printing "Not enough features or parallax; Move device around" until the whole bag file has finished playing. I tried all seven of your open-source sequences, all with the same problem!
And the tracking images in my RViz window are full of blue feature points with very little tracking; there are no red, long-tracked feature points during the entire time the bag file is playing.
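For context, the blue/red coloring of the feature points encodes track length. A sketch of the mapping, under the assumption that it follows the drawTrack-style visualization in the VINS code, where a feature tracked for about 20 frames saturates to pure red:

```python
# Sketch (assumption: modeled on the VINS tracking-image visualization,
# where circle color interpolates from blue (new feature) to red
# (long-tracked feature) as the track count grows).

WINDOW = 20  # frames needed to reach full red (assumed from the released code)

def track_color_bgr(track_cnt):
    """Map a feature's track count to a BGR circle color (OpenCV order)."""
    frac = min(1.0, track_cnt / WINDOW)
    return (int(255 * (1 - frac)), 0, int(255 * frac))  # blue -> red

print(track_color_bgr(0))    # (255, 0, 0): brand-new feature, pure blue
print(track_color_bgr(20))   # (0, 0, 255): long-tracked feature, pure red
```

An image that stays entirely blue therefore means optical flow is losing every feature almost immediately, i.e. the front end is re-detecting instead of tracking.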

(attached screenshots: 选区_002 – 选区_005)

@NEFU-zhang The rosbag seems to work fine on our PC, as shown in the screenshot below.

I recommend using VINS-Fusion (mono-imu mode; it seems more robust than VINS-Mono) and directly copying my config files. I have already created a Baidu link for you; you can download the yamls and put them in the correct folder of your VINS-Fusion project.

Link: https://pan.baidu.com/s/1LnsI5lfXgSxyWSxBYzVKhA?pwd=sjtu
Extraction code: sjtu

(attached screenshot: 2024-01-02_11-57)

Thanks to your advice, I successfully ran VINS-Fusion on my PC!
Could the failure of VINS-Mono be caused by the front-end optical flow tracking failing? Or are the relevant parameters in my yaml file set incorrectly?
Looking forward to your continued updates!
(attached screenshot: 选区_009)