Udacity: Sensor Fusion Nanodegree

Projects:

  • P1: Lidar Obstacle Detection

    This project involved processing point cloud data with PCL, the C++ Point Cloud Library. The data was first segmented using a RANSAC plane model to determine which points belonged to the road surface and which did not. A KD-tree was then used to conduct nearest-neighbour search for clustering, and a bounding box could be drawn around each cluster of points - showing the location of a vehicle.

  • P2: Feature Tracking

    The OpenCV library was used to conduct 2D feature tracking using a variety of keypoint detectors and descriptors. The descriptors were used to match keypoints from one video frame to the next - ultimately these matches could be used to estimate the time-to-collision for an autonomous vehicle.

  • P3: Object Tracking

    The object tracking project combined lidar data with the 2D feature tracking pipeline in order to match vehicle bounding boxes between frames and provide a much more robust estimate of the time-to-collision.

  • P4: Radar Target Detection

    MATLAB code was developed to detect targets using an FMCW radar module. An FFT was applied to the radar data in order to produce a Range-Doppler estimate - CFAR thresholding was then applied to suppress noise and isolate genuine target returns. Once a clean Range-Doppler map had been acquired, clustering could be applied to track individual objects.

  • P5: Unscented Kalman Filter

    The aim of this project was to combine the acquired sensor data into a unified estimate of vehicle state. An Unscented Kalman Filter was developed which fused lidar and radar measurements in order to simultaneously track the states of multiple vehicles along a simulated road.

Author: Daniel Kelshaw