sensor-fusion
There are 586 repositories under the sensor-fusion topic.
mit-han-lab/bevfusion
[ICRA'23] BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation
hku-mars/r3live
A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly-coupled state estimation and mapping package
kriswiner/MPU9250
Arduino sketches for MPU9250 9DoF with AHRS sensor fusion
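The MPU9250 sketches implement full AHRS filters (Madgwick/Mahony) in Arduino C++. A far simpler complementary filter, sketched below in Python purely for illustration (nothing here is the repo's own code), shows the core idea behind gyro/accelerometer fusion: the gyro integrates smoothly but drifts, while the accelerometer is noisy but drift-free.

```python
import math

def complementary_pitch(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse a gyro rate (rad/s) with an accelerometer tilt estimate.

    alpha weights the drifting-but-smooth gyro path; (1 - alpha)
    weights the noisy-but-absolute accelerometer tilt.
    """
    pitch_gyro = pitch_prev + gyro_rate * dt    # integrate gyro rate
    pitch_accel = math.atan2(accel_x, accel_z)  # tilt from gravity vector
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# Stationary, level sensor: an initial 0.1 rad error decays toward zero
# because the accelerometer keeps pulling the estimate back to gravity.
pitch = 0.1
for _ in range(200):
    pitch = complementary_pitch(pitch, gyro_rate=0.0,
                                accel_x=0.0, accel_z=9.81, dt=0.01)
```

The 9-DoF filters in the repo do the same trade on the full orientation quaternion, with the magnetometer playing the accelerometer's role for yaw.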
autonomousvision/transfuser
[PAMI'23] TransFuser: Imitation with Transformer-Based Sensor Fusion for Autonomous Driving; [CVPR'21] Multi-Modal Fusion Transformer for End-to-End Autonomous Driving
hyye/lio-mapping
Implementation of Tightly Coupled 3D Lidar Inertial Odometry and Mapping (LIO-mapping)
lucasjinreal/alfred
alfred-py: A deep learning utility library for humans; more details on library usage at: https://zhuanlan.zhihu.com/p/341446046
hku-mars/FAST-LIVO
A Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry (LIVO).
HKUST-Aerial-Robotics/GVINS
Tightly coupled GNSS-visual-inertial system for locally smooth and globally consistent state estimation in complex environments.
ucla-vision/xivo
X Inertial-aided Visual Odometry
cggos/imu_x_fusion
IMU + X(GNSS, 6DoF Odom) Loosely-Coupled Fusion Localization based on ESKF, IEKF, UKF(UKF/SPKF, JUKF, SVD-UKF) and MAP
TurtleZhong/Map-based-Visual-Localization
A general framework for map-based visual localization. It contains: 1) map generation, supporting traditional or deep-learning features; 2) hierarchical localization in a visual (point or line) map; 3) a fusion framework with IMU, wheel odometry, and GPS sensors.
methylDragon/ros-sensor-fusion-tutorial
An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization! 🛰
KIT-ISAS/lili-om
LiLi-OM is a tightly-coupled, keyframe-based LiDAR-inertial odometry and mapping system for both solid-state and conventional LiDARs.
wvangansbeke/Sparse-Depth-Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st place on KITTI) [2019]
SpectacularAI/HybVIO
HybVIO visual-inertial odometry and SLAM system
aleksandrkim61/EagerMOT
Official code for "EagerMOT: 3D Multi-Object Tracking via Sensor Fusion" [ICRA 2021]
UMich-BipedLab/extrinsic_lidar_camera_calibration
This is a package for extrinsic calibration between a 3D LiDAR and a camera, described in paper: Improvements to Target-Based 3D LiDAR to Camera Calibration. This package is used for Cassie Blue's 3D LiDAR semantic mapping and automation.
zhujun98/sensor-fusion
Kalman filter, sensor fusion
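Several entries above and below come down to the same scalar building block: the Kalman measurement update, which blends a prior estimate with a new reading in proportion to their variances. A minimal sketch (not taken from any of these repos):

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P: prior mean and variance; z, R: measurement and its variance.
    Returns the fused mean and the (always smaller) fused variance.
    """
    K = P / (P + R)            # Kalman gain: trust ratio of prior vs. sensor
    x_new = x + K * (z - x)    # correct mean toward the measurement
    P_new = (1 - K) * P        # fusing information shrinks the variance
    return x_new, P_new

# Fuse two independent readings of the same quantity.
x, P = 10.0, 4.0                            # prior from sensor A (variance 4)
x, P = kalman_update(x, P, z=12.0, R=1.0)   # sensor B is 4x more precise
# x → 11.6 (pulled strongly toward sensor B), P → 0.8
```

The multivariate filters in these repositories generalize `K` to a gain matrix, but the trust-weighting logic is identical.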
Sollimann/CleanIt
Open-source Autonomy Software in Rust-lang using gRPC for the Roomba series robot vacuum cleaners. Under development.
aau-cns/mars_lib
MaRS: A Modular and Robust Sensor-Fusion Framework
leggedrobotics/graph_msf
A graph-based multi-sensor fusion framework. It can be used to fuse various relative or absolute measurements with IMU readings in real time.
radar-lab/ti_mmwave_rospkg
TI mmWave radar ROS driver (with sensor fusion and hybrid)
appinho/SARosPerceptionKitti
ROS package for the Perception (Sensor Processing, Detection, Tracking and Evaluation) of the KITTI Vision Benchmark Suite
enginBozkurt/Error-State-Extended-Kalman-Filter
Vehicle State Estimation using Error-State Extended Kalman Filter
VIS4ROB-lab/HyperSLAM
Modular, open-source implementations of continuous-time simultaneous localization and mapping algorithms.
SJTU-ViSYS/Ground-Fusion
Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA2024)
LahiruJayasinghe/RUL-Net
Deep learning approach for estimation of Remaining Useful Life (RUL) of an engine
CAOR-MINES-ParisTech/ukfm
Unscented Kalman Filtering on (Parallelizable) Manifolds (UKF-M)
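UKF-M's core primitive is the unscented transform: instead of linearizing a nonlinearity (as the EKF does), it pushes a small set of deterministically chosen sigma points through it and recovers the output mean and covariance. The sketch below is a generic Euclidean version with the classic kappa weighting, for illustration only; the repo's contribution is doing this on manifolds, which this sketch does not attempt.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=0.0):
    """Propagate (mean, cov) through a nonlinearity f via sigma points."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)     # matrix square root of scaled cov
    sigmas = ([mean]
              + [mean + L[:, i] for i in range(n)]
              + [mean - L[:, i] for i in range(n)])
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([f(s) for s in sigmas])         # propagate each sigma point
    y_mean = w @ ys                               # weighted output mean
    diff = ys - y_mean
    y_cov = (w[:, None] * diff).T @ diff          # weighted output covariance
    return y_mean, y_cov

# Sanity check on a linear map f(x) = 2x: mean doubles, variance quadruples.
m, C = unscented_transform(np.array([1.0]), np.array([[1.0]]), lambda x: 2 * x)
# m → [2.], C → [[4.]]
```

For linear maps the transform is exact, which makes it a convenient unit test before applying it to the nonlinear motion and measurement models these filters target.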
aster94/SensorFusion
A simple implementation of some complex Sensor Fusion algorithms
xingyuuchen/LIO-PPF
[IROS 2023] Fast LiDAR-Inertial Odometry via Incremental Plane Pre-Fitting and Skeleton Tracking
alexklwong/unsupervised-depth-completion-visual-inertial-odometry
Tensorflow and PyTorch implementation of Unsupervised Depth Completion from Visual Inertial Odometry (in RA-L January 2020 & ICRA 2020)
diegoavillegasg/IMU-GNSS-Lidar-sensor-fusion-using-Extended-Kalman-Filter-for-State-Estimation
State Estimation and Localization of an autonomous vehicle based on IMU (high rate), GNSS (GPS) and Lidar data with sensor fusion techniques using the Extended Kalman Filter (EKF).
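The pattern this repo describes, high-rate IMU prediction corrected by low-rate GNSS fixes, can be sketched in one dimension with a plain linear Kalman filter (the repo itself uses a full EKF on 3D states; everything below, including the noise constants, is illustrative):

```python
import numpy as np

def imu_predict(x, P, accel, dt, q=0.1):
    """Propagate a [position, velocity] state with an IMU acceleration input."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                        # kinematics
    B = np.array([0.5 * dt**2, dt])                              # accel input map
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])  # process noise
    return F @ x + B * accel, F @ P @ F.T + Q

def gnss_update(x, P, z_pos, r=1.0):
    """Correct with a GNSS position fix (observes position only)."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r                  # innovation variance
    K = P @ H.T / S                      # Kalman gain (2x1)
    x = x + (K * (z_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Constant 1 m/s^2 acceleration; one GNSS fix per 10 IMU steps.
x, P = np.zeros(2), np.eye(2)
for k in range(100):
    x, P = imu_predict(x, P, accel=1.0, dt=0.01)
    if (k + 1) % 10 == 0:
        true_pos = 0.5 * ((k + 1) * 0.01) ** 2
        x, P = gnss_update(x, P, true_pos)
# After 1 s of noiseless data: x ≈ [0.5, 1.0]
```

This is the "loosely coupled" shape: the GNSS solution enters as a finished position, rather than raw pseudoranges entering the filter directly.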
aaronboda24/Loose-GNSS-IMU
Loosely coupled integration of GNSS and IMU
XikunLiu-huskit/GLIO
GLIO: Tightly-Coupled GNSS/LiDAR/IMU Integration for Continuous and Drift-free State Estimation
ser94mor/sensor-fusion
Filters: KF, EKF, UKF || Process Models: CV, CTRV || Measurement Models: Radar, Lidar
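The CV (constant velocity) process model this repo lists is the simplest motion model used with radar/lidar tracking: the target keeps its velocity, and unmodeled acceleration enters as process noise. A sketch of its predict step (state layout and noise constant are illustrative, not taken from the repo):

```python
import numpy as np

def cv_predict(x, P, dt, sigma_a=2.0):
    """Constant-velocity predict step for state [px, py, vx, vy].

    Unmodeled acceleration (std sigma_a, per axis) becomes process noise
    via the acceleration-to-state map G.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    G = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]])
    Q = G @ G.T * sigma_a**2
    return F @ x, F @ P @ F.T + Q

x = np.array([0.0, 0.0, 1.0, 2.0])   # at origin, moving at (1, 2) m/s
P = np.eye(4)
x, P = cv_predict(x, P, dt=0.5)      # x → [0.5, 1.0, 1.0, 2.0]
```

The CTRV model the repo also implements replaces `F` with a nonlinear turn-rate propagation, which is why it needs the EKF or UKF rather than the plain KF.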
sharathsrini/Kalman-Filter-for-Sensor-Fusion
A sensor fusion algorithm that predicts a state estimate and updates it when the estimate is uncertain