A project to synchronize and align camera and radar recordings and display data visually. Associated with the Centre of Excellence for Artificial Intelligence and Smart Mobility.
Clone the repository and install dependencies:

```shell
git clone https://github.com/hfyxin/fare-gate-visual-game.git
cd fare-gate-visual-game
python3 -m pip install -r requirements.txt
```
Add the following files in the specified directories:

- `/data/Controlled_test.avi`: controlled test video file. Download from SharePoint.
- `/data/frames/*`: frames with timestamp filenames. Download the most recent test outputs from SharePoint.
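Frames are matched to radar data by the timestamp encoded in their filenames. The exact naming scheme used by the test outputs may differ; the sketch below assumes a hypothetical epoch-seconds format just to illustrate the idea:

```python
from datetime import datetime, timezone

# Hypothetical frame filename: epoch seconds with a fractional part.
# The real test outputs may use a different format.
name = "1651775433.125.jpg"
stem, _ext = name.rsplit(".", 1)        # strip the extension only
ts = float(stem)                        # -> 1651775433.125
captured = datetime.fromtimestamp(ts, tz=timezone.utc)
print(captured.isoformat())
```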
Refer to the descriptions within `config.yaml` for configuration parameters. The default video config is tuned for the video `Controlled_test.avi` and its corresponding radar data file `Control_test1.json`.
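As an illustration, the config might contain entries like the following (the key names here are hypothetical; refer to the actual `config.yaml` in the repository for the real parameters):

```yaml
# Hypothetical structure -- see config.yaml in the repo for the real keys.
video:
  path: data/Controlled_test.avi
radar:
  path: data/Control_test1.json
gate_area:
  x: 100
  y: 50
  width: 200
  height: 150
```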
The main visualization can be run in video mode or frame mode.
Receives footage from a video file. Ensure `mode = "video_mode"` is set at the start of `rta2_cv2visual.py`, then run:

```shell
python rta2_cv2visual.py
```
The OpenCV GUI contains trackbars that can be adjusted during playback for accurate alignment.
Receives footage from frame images. Ensure `mode = "frame_mode"` is set at the start of `rta2_cv2visual.py`, then run:

```shell
python rta2_cv2visual.py
```
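The mode flag at the top of the script selects the footage source. A minimal sketch of how such a switch typically works (variable and path names here are assumptions, not the script's actual code):

```python
# Hypothetical top-of-script mode switch for rta2_cv2visual.py.
mode = "video_mode"   # or "frame_mode" to read timestamped frames

if mode == "video_mode":
    source = "data/Controlled_test.avi"   # assumed default video path
elif mode == "frame_mode":
    source = "data/frames/"               # assumed frames directory
else:
    raise ValueError(f"unknown mode: {mode}")
print(source)
```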
After running the visualization, the trackbar values for the gate area can be saved to the YAML config in response to a prompt.
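Persisting the gate area amounts to merging the trackbar values into the existing config and writing it back. A hedged sketch using PyYAML (assumed available, since the project reads `config.yaml`; the key names and values below are hypothetical):

```python
import yaml  # PyYAML, assumed available since the project reads config.yaml

# Hypothetical gate-area values as they might come off the trackbars;
# the real key names live in config.yaml and may differ.
gate_area = {"x": 120, "y": 80, "width": 200, "height": 150}

# Stand-in for yaml.safe_load(open("config.yaml")): merge rather than
# overwrite so unrelated config entries survive the update.
config = {"video": {"source": "data/Controlled_test.avi"}}
config["gate_area"] = gate_area

with open("config_out.yaml", "w") as f:
    yaml.safe_dump(config, f, default_flow_style=False)
```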
Additional visualization tools are available for analysis purposes.
An animation of radar points as a 3D scatter plot (with standard visual transforms to a 2D perspective), alongside additional data.
Additional requirement: make sure FFmpeg is installed.

```shell
python tools/radar_visualization.py
```
The animation can be saved in response to a prompt at the end of playback.
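The "transform to 2D perspective" mentioned above can be sketched as a plain perspective projection: divide each point's lateral coordinates by its depth. This is a hypothetical illustration, not the tool's actual code:

```python
import numpy as np

def project_to_2d(points_3d, focal_length=1.0):
    """Perspective-project Nx3 radar points onto a 2D image plane.

    Assumes the camera looks down +z; points with z <= 0 are dropped.
    A sketch of a standard perspective transform, not the repo's code.
    """
    pts = np.asarray(points_3d, dtype=float)
    pts = pts[pts[:, 2] > 0]               # keep points in front of the camera
    scale = focal_length / pts[:, 2:3]     # divide by depth
    return pts[:, :2] * scale

pts = np.array([[1.0, 2.0, 2.0], [0.5, 0.5, 1.0], [3.0, 3.0, -1.0]])
print(project_to_2d(pts))  # third point is behind the camera and dropped
```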
Generates a single scatter plot of all radar points, color-coded by TLV type.

```shell
python tools/tlv_scatter.py
```
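Color-coding by TLV type amounts to grouping points by their TLV identifier before plotting, so each group gets its own scatter color. A minimal sketch with made-up point records (the field names and TLV IDs are assumptions, not the tool's actual data format):

```python
from collections import defaultdict

# Hypothetical radar point records; real TLV payloads differ.
points = [
    {"tlv": 1020, "x": 0.4, "y": 2.1},
    {"tlv": 1010, "x": 1.3, "y": 0.8},
    {"tlv": 1020, "x": 0.6, "y": 2.4},
]

# Group coordinates by TLV type; each group would be one scatter call
# with its own color (e.g. via matplotlib).
by_type = defaultdict(list)
for p in points:
    by_type[p["tlv"]].append((p["x"], p["y"]))

for tlv, pts in sorted(by_type.items()):
    print(tlv, pts)
```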
To run tests, run them as modules from the main directory:

```shell
python -m tests.<script-name>
```
- Merge with MQTT repository for live visualization
- YOLO fusion tracking addition