errors with own datasets
YushengWHU opened this issue · 5 comments
Hello, I tried to evaluate your tightly coupled system against my loosely coupled algorithm, but the back-end optimization process of your algorithm always crashes after a while, sometimes after only a couple of seconds, with the output:
I chose several scenarios for comparison with your algorithm:
The first is a common school scene, where the map output crashed as soon as my robot started moving, which looks like a wrong coordinate setup (I also encountered the same situation when testing LIO-SAM). However, I use exactly the same coordinate definition in my second scenario, which seemed fine at first.
Figure 1. Everything seems fine at first in the school scene
Figure 2. As soon as the robot starts moving, the back-end fusion crashes
The second is a railway scene, where the back-end optimization works longer, but it still keeps crashing from time to time.
Hardware setup:
MTi-680G (forward-left-up), Livox Horizon (forward-left-up).
I only used the MTi-680G IMU for preprocessing and back-end fusion.
Do you have any suggestions?
By the way, everything is fine with your datasets, and my algorithm works on your datasets without any modification, which is to say, we have the same coordinate definition.
I don't really get what you mean. You seem to be using the backend of lili-om with your loosely coupled system as the frontend? Or did you just want to compare the two systems? In general, those parameters should be retuned depending on your own sensors, motion, and the deployed surroundings. Also, you may need to take a look at the orientation initialization of your own system. We also tested lili-om live with pedestrian-like motion and in an urban scenario, and the system worked fine. Without more insight into your own system setup, we are not able to give a specific solution for deploying lili-om on your own platform.
Actually, I want to compare the performance of your algorithm with my own. My algorithm works fine on your datasets, which is to say, we have the same hardware coordinate definition. The problem is that when I apply your algorithm to my own datasets, its back-end fusion always crashes after a while. For a better description, below are two examples of my testing scenes.
I used multiple Livox Horizons (5-8) for my algorithm, and I only chose the front one when applying your algorithm.
Figure 1. School scene
Figure 2. Railway high-speed scene
The scene should be suitable for lili-om. However, you have a different sensor setup, which definitely requires retuning the parameters. The system is well deployable on a different hardware setup or application scenario with proper parameter tuning and, sometimes, necessary customization (e.g., the one in the pinned issue).
The problem here is a tiny difference between the MTi-680G and the MTi-670: although they have the same coordinate definition, their outputs differ slightly.
As you can see, the X and Y axes of the MTi-670's output are opposite to those of the Livox IMU, while the MTi-680G follows the same convention as the Livox IMU.
After negating the x and y components of the angular velocity and linear acceleration, everything works fine.
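For anyone hitting the same issue, the sign flip above can be sketched as a small helper; this is a minimal illustration (the function name and tuple layout are my own, not part of lili-om, which would apply the same negation in its IMU callback):

```python
def flip_imu_xy(gyro, accel):
    """Negate the x and y components of angular velocity and linear
    acceleration so an MTi-670-style IMU sample matches the Livox IMU
    axis convention; z stays unchanged.

    gyro:  (x, y, z) angular velocity, rad/s
    accel: (x, y, z) linear acceleration, m/s^2
    """
    gx, gy, gz = gyro
    ax, ay, az = accel
    return (-gx, -gy, gz), (-ax, -ay, az)


# Example: flip one raw sample before feeding it to the backend.
gyro_fixed, accel_fixed = flip_imu_xy((0.1, -0.2, 0.3), (0.5, 9.8, -0.1))
```

In a ROS node this would be applied to each incoming `sensor_msgs/Imu` message before republishing it on the topic the backend subscribes to.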