sentriz/gonic

Extremely high memory usage when running an incremental scan on 200k+ files


gonic version: from source, 88e58c0

if from source, git tag/branch: master

When running an incremental scan over 200,000 files, memory usage ramps up to 5 GB or more.

I am running Arch Linux on a Raspberry Pi 4, using an SQLite database which is about 60 MB in size when the scan starts.

My music directory contains 6 symlinks, each pointing to a category/genre directory.

The scan proceeds normally until memory usage hits about 3-3.5 GB; after that it grows by roughly 100 MB per second.

At that point, the scanned directory entries appear in the log significantly more slowly.
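
For anyone trying to reproduce this, the growth is easy to watch from a second terminal while the scan runs, e.g. (assuming the process is simply named gonic):

```
# print gonic's resident set size (in KiB) once a second
watch -n 1 'ps -o rss= -C gonic'
```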

woah that's crazy. was this always the case? if not, maybe you could try git bisect to find the commit which introduced this?
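
for reference, a bisect session could look something like this (`<known-good>` is a placeholder for whatever older revision you remember behaving, and the build line assumes the usual cmd/gonic layout):

```
git bisect start
git bisect bad 88e58c0        # the revision you're running now, which leaks
git bisect good <known-good>  # placeholder: any older revision that behaved
# git now checks out a commit halfway between; build it and run a scan:
go build -o gonic ./cmd/gonic && ./gonic
# then mark the result and repeat until bisect names the culprit:
git bisect good   # or: git bisect bad
git bisect reset  # restore your original checkout when done
```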

No, it looks like this is fairly recent; I'd say within the last 2-3 months.

Unfortunately I don't have much time these days, but I'll do a bisect when I can get to it. Thanks for the tip, though it might take me a while.

thanks! also I just pushed a commit which may help

Huge thanks for this.

With 7cd1bee, memory usage stays around 100 MB during a scan and everything works as expected.

thanks for reporting the issue 👌