Question about buffer management
Closed this issue · 1 comment
First of all, thank you for this very well-built and well-documented implementation. I'm in the process of creating something similar in Python and was hoping you could clarify a few doubts.
I am running color magnification using my webcam (30 fps), with an image buffer size of 999.
As expected, the capture thread adds frames at 30 fps while the processing thread consumes them at a much lower rate, so the image buffer fills up.
When the buffer is full, I would expect the processing thread to be working on the frame captured 999 frames earlier, so a large delay should be visible in the processed image in the GUI, but that does not happen.
Although the processed video's frame rate is low (because of the low processing rate), the frame delay is much smaller than the expected 999 frames. This is desirable, but I can't see what in the code causes it.
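To put a number on the delay I'd expect: once the buffer is full, each newly admitted frame has to wait for the whole queue ahead of it to drain, so the latency should be roughly buffer_size / processing_fps. A quick back-of-the-envelope check (the 5 fps processing rate here is an assumed figure for illustration, not measured from the repo):

```python
buffer_size = 999          # frames queued ahead of a newly admitted frame
processing_fps = 5         # assumed (hypothetical) processing rate

# Steady-state queueing delay: the queue drains at the processing rate,
# so a frame entering a full buffer waits buffer_size / processing_fps.
expected_delay_s = buffer_size / processing_fps
print(f"{expected_delay_s:.1f} s")  # 199.8 s, i.e. minutes of lag
```

That magnitude of lag would be obvious in the GUI, which is why the low observed delay surprises me.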
From my understanding, in lines 102 and 108 of ProcessingThread.cpp, the oldest frame in the sharedImageBuffer is added to the ProcessingBuffer, which should lead to a delay proportional to the number of unprocessed frames in the buffer.
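For comparison, the behaviour I observe would match a consumer that skips ahead, i.e. one that drains the buffer and processes only the newest frame, dropping the stale ones. I'm not saying that's what this repo does; this is just a hypothetical sketch (all names are mine) of the pattern in Python, where the producer is simulated as 10 captures per processing cycle:

```python
import collections

def drain_to_latest(buffer):
    """Discard all queued frames except the newest; return it (None if empty)."""
    latest = None
    while buffer:
        latest = buffer.popleft()
    return latest

buffer = collections.deque(maxlen=999)   # shared image buffer (simulated)
processed = []
frame_id = 0
for cycle in range(5):
    for _ in range(10):                  # capture thread: 10 new frames arrive
        buffer.append(frame_id)
        frame_id += 1
    processed.append(drain_to_latest(buffer))  # processing thread: newest only

print(processed)  # [9, 19, 29, 39, 49]: always the most recently captured frame
```

With this pattern the displayed frame is always recent, at the cost of skipping intermediate frames, which matches what I see (low delay, low processed frame rate). Is something like this happening in the code?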
What causes this behaviour?
Thank you
Hey, the code handling the buffer is pretty old and comes from another repo (as mentioned in the README).