This Python script implements a particle filter-based face tracking algorithm using OpenCV. It tracks a face in a video stream by predicting and updating the state of particles that represent candidate face positions.
Particle Filter Implementation
- Uses a particle filter for dynamic state estimation in a video stream.
- Represents the tracked face as a weighted particle cloud that is predicted, re-weighted, and resampled each frame.
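The predict/update/resample cycle described above can be sketched as follows. This is a minimal NumPy-only illustration, not the script's actual code: the `[x, y]` state layout, the Gaussian motion and measurement models, and the noise scales are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, motion_std=5.0):
    """Diffuse each particle with Gaussian noise (random-walk motion model)."""
    return particles + rng.normal(0.0, motion_std, particles.shape)

def weight(particles, measurement, meas_std=10.0):
    """Score particles by Gaussian likelihood of the observed face position."""
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / meas_std**2)
    return w / w.sum()

def resample(particles, weights):
    """Draw a new particle set proportional to the weights (systematic resampling)."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]

# One filter step against a synthetic face position
particles = rng.normal([100.0, 100.0], 20.0, size=(200, 2))
particles = predict(particles)
w = weight(particles, measurement=np.array([105.0, 98.0]))
particles = resample(particles, w)
estimate = particles.mean(axis=0)  # concentrates near the measurement
```

In the real script the measurement step would compare each particle's image patch against the selected face ROI rather than use a known position.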
Real-Time Tracking
- Performs real-time tracking of a face with bounding box visualization.
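One plausible way to get the drawn bounding box is to take the weighted mean of the particle cloud as the state estimate and center a box of the original ROI size on it. The helper below is a NumPy-only sketch of that arithmetic (the function name and signature are invented for illustration); the drawing call itself would be OpenCV's `cv2.rectangle`, shown in the comment.

```python
import numpy as np

def estimate_box(particles, weights, roi_w, roi_h):
    """Weighted-mean particle position -> (x, y, w, h) box centered on it."""
    cx, cy = np.average(particles, axis=0, weights=weights)
    return int(cx - roi_w / 2), int(cy - roi_h / 2), roi_w, roi_h

particles = np.array([[100.0, 60.0], [110.0, 64.0], [90.0, 56.0]])
weights = np.array([0.5, 0.25, 0.25])
box = estimate_box(particles, weights, roi_w=40, roi_h=40)

# Drawing with OpenCV would look like:
# cv2.rectangle(frame, (box[0], box[1]),
#               (box[0] + box[2], box[1] + box[3]), (0, 255, 0), 2)
```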
User Interaction
- Allows manual selection of the initial region of interest (ROI) around the face.
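Once the user confirms the ROI, the particle set has to be seeded from it. A common approach, sketched below under the assumption that the script uses `cv2.selectROI` (which returns an `(x, y, w, h)` tuple), is to scatter particles around the ROI center; the `init_particles` helper and its spread parameter are hypothetical. A hard-coded ROI stands in for the interactive selection so the example runs headless.

```python
import numpy as np

def init_particles(roi, n=200, spread=10.0, seed=0):
    """Scatter n particles around the center of an (x, y, w, h) ROI."""
    x, y, w, h = roi
    center = np.array([x + w / 2.0, y + h / 2.0])
    rng = np.random.default_rng(seed)
    return rng.normal(center, spread, size=(n, 2))

# In the script this would come from: roi = cv2.selectROI("frame", first_frame)
roi = (120, 80, 60, 60)  # hard-coded stand-in for the interactive selection
particles = init_particles(roi)
```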
Robust and Adaptable
- Capable of adapting to occlusions and varying face appearances.
Usage
- Run the script.
- Select the face ROI in the first frame of the video.
- Press 'Enter' to start tracking.
- Press 'q' to quit.
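The steps above can be sketched end-to-end. In this runnable approximation the video I/O and GUI calls are replaced with a synthetic moving target, so it works anywhere; the real script would instead read frames with `cv2.VideoCapture`, score particles against the selected ROI, and break out of the loop when `cv2.waitKey` reports `q`.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
particles = rng.normal([50.0, 50.0], 5.0, size=(n, 2))

# Synthetic "face" drifting diagonally; in the real script the measurement
# would come from comparing each particle's patch to the selected ROI.
for t in range(30):
    target = np.array([50.0 + 2 * t, 50.0 + t])
    particles += rng.normal(0.0, 3.0, particles.shape)      # predict
    d2 = np.sum((particles - target) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / 8.0**2)                          # update
    w /= w.sum()
    particles = particles[rng.choice(n, size=n, p=w)]       # resample

estimate = particles.mean(axis=0)  # should track the moving target
```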
Limitations
- Computational cost grows with the number of particles.
- Sensitive to initialization and parameter settings.
- Primarily focused on face tracking, limiting its use in broader contexts.