caddyserver/certmagic

How to extend cache to behave as LRU Cache

OrkhanAlikhanov opened this issue · 5 comments

What is your question?

How can I extend the cache to behave as an LRU cache? Right now it evicts entries randomly when capacity is reached. Any directions are welcome.

mholt commented

Can be done but requires some more state. Is this impacting your deployments?
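To illustrate the kind of extra state this would take (purely a sketch; the type and field names below are hypothetical, not certmagic's actual cache API), each entry can carry a last-access timestamp that is bumped on every lookup, and the entry with the oldest timestamp is evicted when the cache is full:

```go
package cache

import (
	"sync"
	"time"
)

// lruCache is a hypothetical illustration, not certmagic's Cache type.
type lruCache struct {
	mu       sync.Mutex
	capacity int
	entries  map[string]*entry // keyed by certificate name/tag
}

type entry struct {
	cert       []byte    // stand-in for a cached certificate
	lastAccess time.Time // the extra state needed for LRU
}

func newLRUCache(capacity int) *lruCache {
	return &lruCache{capacity: capacity, entries: make(map[string]*entry)}
}

// Get returns a cached certificate and bumps its last-access time.
func (c *lruCache) Get(key string) ([]byte, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	e, ok := c.entries[key]
	if !ok {
		return nil, false
	}
	e.lastAccess = time.Now()
	return e.cert, true
}

// Put stores a certificate, evicting the least recently used entry
// if the cache is at capacity.
func (c *lruCache) Put(key string, cert []byte) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if len(c.entries) >= c.capacity {
		var oldestKey string
		var oldest time.Time
		for k, e := range c.entries {
			if oldestKey == "" || e.lastAccess.Before(oldest) {
				oldestKey, oldest = k, e.lastAccess
			}
		}
		delete(c.entries, oldestKey)
	}
	c.entries[key] = &entry{cert: cert, lastAccess: time.Now()}
}
```

The linear scan on eviction keeps the sketch short; a real implementation would pair the map with container/list (or a min-heap on lastAccess) to make eviction O(1).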

OrkhanAlikhanov commented

Thanks for your quick reply. We are in the early stage of moving away from wildcard certificates, so nothing is impacted yet. We want to evict the less frequently used certs first, since that makes more sense in our case. Consider a service like statuspage.io, which gives you yourpage.statuspage.io, or Netlify/Vercel preview deployments like example-somehash.vercel.app: you can have hundreds of domains, but some of them are visited only once while others are visited multiple times a day. So randomized removal may remove the most-used one, which is not ideal.

mholt commented

I see. Great use case, but I think we should first make a decision like this based on data/experience.

Why do you want LRU specifically? Why does it "make more sense"?

> So randomized removal may remove the most-used one, which is not ideal.

It also may remove the least-used one, which is most ideal.

There are a lot of cache eviction options, so if we're going to invest in the complexity of something other than random, we might as well make sure it serves us well. 👍

Let's get some data from an actual deployment to see why random isn't a good fit, and to make it clearer that another algorithm would be better.
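As a hedged sketch of how such data could be gathered (the counter struct below is hypothetical; certmagic does not expose these fields), a deployment could count hits, misses, and evictions and log them periodically, which is enough to see whether random eviction is actually hurting the hit rate:

```go
package cache

import (
	"log"
	"sync/atomic"
	"time"
)

// stats is a hypothetical counter block for a certificate cache.
type stats struct {
	hits      atomic.Int64 // lookups served from the cache
	misses    atomic.Int64 // lookups that had to load or obtain a certificate
	evictions atomic.Int64 // entries removed because capacity was reached
}

// report logs the counters periodically so a real deployment can show
// how often eviction actually costs us a needed certificate.
func (s *stats) report(every time.Duration) {
	for range time.Tick(every) {
		log.Printf("cert cache: hits=%d misses=%d evictions=%d",
			s.hits.Load(), s.misses.Load(), s.evictions.Load())
	}
}
```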

One thing to keep in mind: LRU has a bad worst case if the cache is slightly smaller than the working set.
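To make that worst case concrete, here is a toy simulation (an illustrative sketch, not based on certmagic's code): cycling through 101 keys with a capacity of 100 means LRU evicts each entry just before it is needed again, so it scores zero hits, while random eviction still keeps a useful fraction of the working set on the same trace:

```go
package main

import (
	"container/list"
	"fmt"
	"math/rand"
)

// simulateLRU replays a cyclic access pattern over nKeys keys against an
// LRU cache of the given capacity and returns the total hit count.
func simulateLRU(capacity, nKeys, rounds int) int {
	order := list.New()            // front = most recently used
	pos := map[int]*list.Element{} // key -> node in order
	hits := 0
	for r := 0; r < rounds; r++ {
		for k := 0; k < nKeys; k++ {
			if el, ok := pos[k]; ok {
				hits++
				order.MoveToFront(el)
				continue
			}
			if order.Len() >= capacity { // evict the least recently used key
				back := order.Back()
				delete(pos, back.Value.(int))
				order.Remove(back)
			}
			pos[k] = order.PushFront(k)
		}
	}
	return hits
}

// simulateRandom replays the same pattern with uniformly random eviction.
func simulateRandom(capacity, nKeys, rounds int) int {
	cache := map[int]bool{}
	hits := 0
	for r := 0; r < rounds; r++ {
		for k := 0; k < nKeys; k++ {
			if cache[k] {
				hits++
				continue
			}
			if len(cache) >= capacity { // evict a uniformly random victim
				keys := make([]int, 0, len(cache))
				for key := range cache {
					keys = append(keys, key)
				}
				delete(cache, keys[rand.Intn(len(keys))])
			}
			cache[k] = true
		}
	}
	return hits
}

func main() {
	// 101-key working set cycling over a 100-entry cache for 100 rounds.
	fmt.Println("LRU hits:   ", simulateLRU(100, 101, 100))
	fmt.Println("random hits:", simulateRandom(100, 101, 100))
}
```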

OrkhanAlikhanov commented

Thanks for your thoughts.

If someone stumbles upon this, https://en.wikipedia.org/wiki/Cache_replacement_policies might be a useful resource to check out.

mholt commented

Closing, since there's nothing actionable at this point; but if we get some useful data as mentioned above, then we can reopen and discuss which policy to use if random is not sufficient.