Ability to delete data, or alternatively set a max window size and automatically trim the oldest data
terjew opened this issue · 12 comments
As the title says, I need some way of limiting the size of the dataset as the application keeps running. If I speed up the generation of data points, the application gets sluggish over time, presumably because more and more time is spent growing and copying the data buffer to make room for new data.
Is it possible to add support for trimming the beginning of the buffer, either on-demand or by setting a maximum buffer size? Perhaps the library could have an option where it uses a circular (ring) buffer for the data so no copying is needed?
Yes, I definitely want to support this, but I'm not sure about the API design. My current thought is to override the prototype of the data array and hook into Array.prototype.splice(), Array.prototype.shift(), and some other methods. Then I'd track the changes and sync them to the GPU.
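For illustration, here is a minimal sketch of one way such hooking could work, overriding the mutating methods on the array instance. This is only my reading of the idea, not the branch's actual implementation; trackMutations and its onChange callback are hypothetical names.

```js
// Sketch only: hook mutating array methods on a specific instance so that
// changes can be collected and replayed against GPU buffers later.
function trackMutations(data, onChange) {
    for (const name of ['push', 'splice', 'shift']) {
        const original = data[name];
        // Override on the instance; Array.prototype itself is left untouched.
        data[name] = function (...args) {
            const result = original.apply(this, args);
            onChange(name, args); // record what changed for the next sync
            return result;
        };
    }
    return data;
}

// Hypothetical usage: collect pending changes, to be consumed on the next update.
const pendingChanges = [];
const trackedData = trackMutations([], (op, args) => pendingChanges.push({ op, args }));
trackedData.push({ x: 0, y: 1 });
trackedData.splice(0, 1);
```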
So to trim the oldest data, the users would write:
```js
const data = [...];
const chart = new TimeChart(el, {
    series: [{ data }],
});
// trim the 10 oldest data points
data.splice(0, 10);
chart.update();
```
This would be consistent with the current data.push(...) design, and is generic enough to also cover the prepending-data use case (#8). The downside is that I cannot support all array operations, which might cause confusion, and the rendering would be strange if the data were accidentally changed through an untracked operation.
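As a hypothetical example of the prepending case, the same splice hook would presumably also allow insertion at the front; whether insertions are tracked this way is an assumption here, not something the thread confirms:

```js
// Hypothetical: prepend two older points via the same splice hook.
data.splice(0, 0, { x: -2, y: 0.5 }, { x: -1, y: 0.7 });
chart.update();
```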
How do you like this design?
This sounds like exactly what I need. Splicing the first x elements and then pushing the same number of new entries is what the update loop would typically do anyway, so I think those two operations are the most relevant.
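As a concrete sketch of that update loop under the proposed API (my own example; maxPoints, onNewSamples, and the point format are assumptions):

```js
// Sliding-window update loop: append new samples, drop the oldest ones.
const maxPoints = 10000; // assumed window size
const data = [];
const chart = new TimeChart(el, { series: [{ data }] });

function onNewSamples(samples) { // samples = [{ x, y }, ...]
    data.push(...samples);
    const excess = data.length - maxPoints;
    if (excess > 0) {
        data.splice(0, excess); // trim the oldest points
    }
    chart.update(); // sync the tracked changes to the GPU
}
```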
Instead of hooking the operations on the user-created array, I guess the TimeChart library could have its own data buffer class exposing only the supported operations.
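For example, such a class might look roughly like this (purely hypothetical; DataBuffer, trimStart, and the change log are illustrative and not part of any TimeChart API):

```js
// Hypothetical buffer class exposing only trackable operations.
class DataBuffer {
    constructor() {
        this.points = [];
        this.changes = []; // pending operations for the chart to sync to the GPU
    }
    push(...points) {
        this.points.push(...points);
        this.changes.push({ op: 'push', count: points.length });
    }
    trimStart(count) {
        this.points.splice(0, count);
        this.changes.push({ op: 'trimStart', count });
    }
    // chart.update() would consume `changes` and then clear the list.
}
```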
@huww98 let me know if you have a test version of this that you would like me to try. I'm eager to see if this can make the library a viable option for my project; as far as I can see, this is the only remaining showstopper.
@terjew Take a look at https://github.com/huww98/TimeChart/tree/wip-dynamic-data
The demo should work, but I have not carefully checked that this feature works in all cases.
I finally had some time to try this properly for my project, and the good news is that it does indeed work. The bad news is that after pushing around 130k points (131073 seems to be the magic number) to the array, the rendering stops. The following error is observed in the JavaScript console:
This problem is also observed in the regular demo when using the wip-dynamic-data branch. Just leave it running until it has produced 132k points and notice that the lines stop updating towards the right-hand side.
Could this somehow be related to using a 17-bit counter somewhere? 2^17 is indeed 131072, one short of the magic number causing problems for me.
I allocate that much GPU memory at first, then allocate a second chunk of memory if the first one is not large enough. There might be some bugs when dealing with more than one chunk of data.
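A minimal sketch of what such chunked allocation might look like in WebGL, assuming chunks of 2^17 points of two 32-bit floats each; the constants, function names, and the no-splitting simplification are all my assumptions, not the library's code:

```js
// Sketch: fixed-size GPU chunks; a new buffer is allocated when the current
// chunk is full. Draw calls then have to span multiple buffers, which is
// where the multi-chunk bugs discussed above could creep in.
const POINTS_PER_CHUNK = 1 << 17; // 131072, matching the observed limit
const BYTES_PER_POINT = 2 * 4;    // x and y as 32-bit floats

function createChunk(gl) {
    const buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, POINTS_PER_CHUNK * BYTES_PER_POINT, gl.DYNAMIC_DRAW);
    return { buffer, used: 0 };
}

function appendPoints(gl, chunks, points /* Float32Array of x,y pairs */) {
    const count = points.length / 2;
    let chunk = chunks[chunks.length - 1];
    if (!chunk || chunk.used + count > POINTS_PER_CHUNK) {
        chunk = createChunk(gl); // current chunk full: start a new one
        chunks.push(chunk);      // (for simplicity, points are not split across chunks)
    }
    gl.bindBuffer(gl.ARRAY_BUFFER, chunk.buffer);
    gl.bufferSubData(gl.ARRAY_BUFFER, chunk.used * BYTES_PER_POINT, points);
    chunk.used += count;
}
```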
You should be able to reproduce the bug simply by running the regular demo on the wip branch and leaving it running for around 2 minutes.
@terjew v1.0.0-beta6 is out. Please give it a try.
Also, could you help review the new doc about this? https://github.com/huww98/TimeChart#dynamic-data
@huww98 I finally had some time to try the latest beta, and I can confirm that it fixes the issue where rendering stopped after 2^17 points were added, awesome!
The documentation also looks good as far as I can see. My only issue at the moment is that the performance of the splice operation is quite bad. If I do a splice every frame as I add new points, the performance of my application with 3 dynamically updated charts falls from a stable 70 FPS (on a 70 Hz screen) to less than 30 FPS. If I instead splice only occasionally, when the number of points exceeds the buffer size plus a tolerance, I get a mostly stable 70 FPS, but still with noticeable pauses whenever the tolerance is reached.
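A sketch of that batched trimming strategy as I understand it (bufferSize, tolerance, and onFrame are illustrative names and values, not from the thread):

```js
// Batched trimming: append every frame, but splice only once the buffer
// exceeds its target size by a tolerance, trading memory for fewer pauses.
const bufferSize = 100000; // assumed target window size
const tolerance = 10000;   // assumed slack before trimming
const data = [];
const chart = new TimeChart(el, { series: [{ data }] });

function onFrame(newPoints) {
    data.push(...newPoints);
    if (data.length > bufferSize + tolerance) {
        // One larger, occasional splice instead of a small one every frame.
        data.splice(0, data.length - bufferSize);
    }
    chart.update();
}
```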
That needs more investigation. If it is the array splice method from the browser that causes the pause, we may need to implement our own ring buffer.
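For reference, a ring buffer along these lines might look as follows; this is only a sketch of the general idea, not anything implemented in TimeChart:

```js
// Fixed-capacity ring buffer: appending past capacity overwrites the oldest
// point, so deleting old data needs no copying or reallocation.
class RingBuffer {
    constructor(capacity) {
        this.xs = new Float64Array(capacity);
        this.ys = new Float64Array(capacity);
        this.capacity = capacity;
        this.start = 0;   // index of the oldest retained point
        this.length = 0;
    }
    push(x, y) {
        const i = (this.start + this.length) % this.capacity;
        this.xs[i] = x;
        this.ys[i] = y;
        if (this.length < this.capacity) {
            this.length += 1;
        } else {
            this.start = (this.start + 1) % this.capacity; // overwrite the oldest point
        }
    }
    get(i) { // i = 0 is the oldest retained point
        const j = (this.start + i) % this.capacity;
        return { x: this.xs[j], y: this.ys[j] };
    }
}
```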
Anyway, we are able to delete data now.