Default eviction policy
I wonder what the default eviction policy is. With a relatively small cache size (150), we seem to be experiencing consistent eviction of the most recently inserted item 🤔.
Items are evicted when either the quick list or full eviction runs; both remove expired items from the cache.
In addition, when you specify a desired total cache item limit, adding a new item while at or over capacity trims the existing items to make space. The trim removes at least 5% of the cache, but no fewer than 1 item and no more than 512, so that the write executes within reasonable latency (see the sketch below the link):
https://github.com/neon-sunset/fast-cache/blob/main/src/FastCache.Cached/CacheManager.cs#L132-L164
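For illustration, the trim size described above could be computed roughly like this (a sketch restating the text, with a hypothetical helper name; the linked `CacheManager` source is authoritative):

```csharp
using System;

// Hypothetical helper restating the rule above: trim at least 5% of the
// current item count, but never fewer than 1 item and never more than 512.
static int TrimCount(int totalCount) =>
    Math.Clamp(totalCount / 20, 1, 512); // totalCount / 20 == 5%
```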
Keep in mind that the following code may not perform as expected:
https://github.com/Flow-Launcher/Flow.Launcher/blob/106760ca66f25a8556658f0d8681fa48bdd4f631/Flow.Launcher.Infrastructure/Image/ImageCache.cs#L8
```csharp
set
{
    if (Cached<ImageUsage>.TryGet((path, isFullImage), out var cached))
    {
        cached.Value.imageSource = value;
        cached.Value.usage++;
    }

    Cached<ImageUsage>.Save((path, isFullImage), new ImageUsage(0, value),
        TimeSpan.MaxValue, MaxCached);
}
```
The cache item is being unconditionally overwritten by the `.Save` call. I assume the intended logic looks as follows:
```csharp
set
{
    if (Cached<ImageUsage>.TryGet((path, isFullImage), out var cached))
    {
        cached.Value.imageSource = value;
        // Atomic increment: plain ++ is a non-atomic read-modify-write.
        Interlocked.Increment(ref cached.Value.usage);
        return;
    }

    // Only reached on a cache miss, so the existing entry is not overwritten.
    cached.Save(new ImageUsage(0, value), TimeSpan.MaxValue, MaxCached);
}
```
The purpose is not 100% clear to me and you may want to double-check how the cache is used in this case.
You may also want to use `Interlocked` to increment the usage count in the getter too, as a plain increment is not thread-safe (`++` is a read-modify-write operation, so two concurrent increments can overwrite each other). If the usage count only serves an estimation purpose, you can ignore this; a getter sketch follows below.
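For illustration, a minimal sketch of what that getter could look like (the actual getter body is not shown in this thread, so its shape here is an assumption mirroring the setter above):

```csharp
get
{
    if (Cached<ImageUsage>.TryGet((path, isFullImage), out var cached))
    {
        // Atomic read-modify-write instead of cached.Value.usage++.
        Interlocked.Increment(ref cached.Value.usage);
        return cached.Value.imageSource;
    }

    return null; // Assumed miss behavior; adjust to the actual contract.
}
```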
There also seem to be quite a few unneeded transient allocations in `ImageLoader`. FastCache has `CachedRange<V>`, which can consume an `IEnumerable<((string, bool), ImageUsage)>` directly; see the sketch below.
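As a hedged sketch of what that bulk path could look like (`paths` and `LoadImage` are hypothetical placeholders, and the `Save` overload taking an `IEnumerable` plus expiration is assumed from the description above):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using FastCache; // namespace assumed for Cached<T>/CachedRange<T>

// 'paths' and 'LoadImage' stand in for whatever ImageLoader enumerates.
IEnumerable<((string, bool), ImageUsage)> entries = paths
    .Select(path => ((path, false), new ImageUsage(0, LoadImage(path))));

// Bulk insert in one call instead of per-item Save invocations.
CachedRange<ImageUsage>.Save(entries, TimeSpan.MaxValue);
```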
Overall, I'm not sure FastCache is the right tool for this use case. `ImageCache` hides the fact that it is static (multiple instances of `ImageCache` will observe the same static instance of `Cached<(string, bool), ImageUsage>`), and FastCache is definitely overkill for just 150 items - it was written for lean but highly loaded microservices that tolerate a degree of global mutable state. For more complex lifetime and eviction policies, as well as hit tracking, there are other cache libraries, and the performance difference in a desktop application will be within noise at this item count.
The launcher does look nice though 😀
Thanks for the wonderful reply. I am sorry for the late response, as I missed this message. I will double-check the code I wrote.
On the other hand, I agree with you that it is total overkill to use FastCache, and I think we should migrate to a library that supports a more complex eviction policy (like LFU).
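For illustration only, here is roughly what such a migration could look like with BitFaster.Caching's `ConcurrentLfu` (my example, not something agreed on above; the tuple key mirrors `ImageCache`, and `LoadImage` is a hypothetical loader):

```csharp
using BitFaster.Caching.Lfu;
using System.Windows.Media; // ImageSource, assuming the WPF type used by the launcher

// LFU-style cache (W-TinyLFU) with a hard capacity matching the 150-item limit.
var cache = new ConcurrentLfu<(string Path, bool IsFullImage), ImageSource>(150);

// GetOrAdd loads on a miss and records the access for frequency tracking.
var image = cache.GetOrAdd((path, isFullImage), key => LoadImage(key.Path));
```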
I think the eviction policy of FastCache depends on the enumerator. However, it seems to consistently evict the most recently inserted item (maybe because our limit is small). I do feel like cache hit rate is an important metric for a good cache library, beyond raw speed.