Action-based rate limiting
rmarronnier opened this issue · 2 comments
To protect a web app against abusive connections/requests (sub-DDoS scale), several layers can help (nginx, middleware), but after stumbling on Rails' upcoming rate limiting, I'm quite jealous of the elegant DX of this solution.
I could clearly see this working in Lucky:
```crystal
class SignIns::New < BrowserAction
  include Auth::RedirectSignedInUsers

  rate_limit to: 50, within: 10.seconds

  get "/sign_in" do
    html NewPage, operation: SignInUser.new
  end
end
```
I can't implement this right now; I'm just putting it out there as a starting point for discussion/inspiration.
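Just to make the idea concrete, here's a rough sketch of how such a `rate_limit` macro could hook into an action as a `before` pipe. Everything here is hypothetical, not Lucky's actual API: the `Lucky::RateLimit` module name, the naive in-process `COUNTERS` hash, the per-IP key choice, and the assumption that `plain_text` accepts a `status:` argument. A real version would delegate the counting to a shared store.

```crystal
# Hypothetical sketch only: names and the `plain_text ... status:` call
# are assumptions for discussion, not Lucky's real API.
module Lucky::RateLimit
  # Naive per-process, fixed-window counters. A real implementation would
  # use a shared store so limits hold across processes/machines, and would
  # guard this hash with a mutex.
  COUNTERS = {} of String => {count: Int32, started_at: Time}

  macro rate_limit(to, within)
    before enforce_rate_limit

    private def enforce_rate_limit
      key = "#{self.class.name}:#{context.request.remote_address}"
      now = Time.utc
      entry = Lucky::RateLimit::COUNTERS[key]?

      if entry.nil? || (now - entry[:started_at]) > {{ within }}
        # New window: reset the counter and let the request through.
        Lucky::RateLimit::COUNTERS[key] = {count: 1, started_at: now}
        continue
      elsif entry[:count] < {{ to }}
        Lucky::RateLimit::COUNTERS[key] = {count: entry[:count] + 1, started_at: entry[:started_at]}
        continue
      else
        # Over the limit: short-circuit the pipe with a 429.
        plain_text "Too many requests", status: 429
      end
    end
  end
end
```

With `include Lucky::RateLimit` in `BrowserAction`, the `rate_limit to: 50, within: 10.seconds` call from the example above would expand into that pipe.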
Yeah, I dig it. I think Rails has an easy way to handle it now that their key/value store is built in. I wonder how this would scale with a Postgres backend. Or would this feature require Redis to be added in?
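For what it's worth, Postgres can do the counting in a single atomic statement, so each request costs one round-trip and one indexed row write. A minimal fixed-window sketch with crystal-db/crystal-pg; the `rate_limits` table, connection URL, and `allow?` helper are all made up for illustration:

```crystal
require "db"
require "pg"

# Sketch only. Assumed table (not from the thread):
#   CREATE TABLE rate_limits (
#     key TEXT NOT NULL,
#     window_start BIGINT NOT NULL,
#     count INT NOT NULL,
#     PRIMARY KEY (key, window_start)
#   );
# Stale windows would need a periodic DELETE.
def allow?(db : DB::Database, key : String, limit : Int32, window : Time::Span) : Bool
  # Fixed window: truncate "now" down to the start of the current window.
  w = window.total_seconds.to_i64
  window_start = (Time.utc.to_unix // w) * w

  # One atomic upsert-and-increment per request: no read-modify-write race.
  count = db.query_one(<<-SQL, key, window_start, as: Int32)
    INSERT INTO rate_limits (key, window_start, count)
    VALUES ($1, $2, 1)
    ON CONFLICT (key, window_start)
    DO UPDATE SET count = rate_limits.count + 1
    RETURNING count
  SQL
  count <= limit
end

db = DB.open("postgres://localhost/my_app")
allow?(db, "SignIns::New:203.0.113.7", 50, 10.seconds) # => true on the first hit
```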
The best solution would be to rely on https://github.com/luckyframework/lucky_cache and let the user configure their preferred backend (Redis, DB, etc.). That's what Rails ended up doing: rails/rails#50781
It would require implementing new store(s) for lucky_cache (Redis, for example). Caching for Avram might also benefit from it.
Adding a Redis store to lucky_cache would introduce a dependency on a Redis shard (not an issue for Redis-less users, as long as the Redis store isn't used).
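If a Redis store does land in lucky_cache, the rate limiter would probably want an atomic increment rather than a read-then-write fetch. A minimal fixed-window check in the stefanwille/crystal-redis command style; the `allow?` helper is hypothetical, and whether lucky_cache's store interface would expose an increment operation is an open question:

```crystal
require "redis"

# Hypothetical helper, not part of lucky_cache: the classic INCR + EXPIRE
# fixed-window pattern. INCR is atomic, so concurrent requests can't race.
def allow?(redis : Redis, key : String, limit : Int32, window : Time::Span) : Bool
  count = redis.incr(key)
  # First hit in the window: start the expiry clock.
  redis.expire(key, window.total_seconds.to_i) if count == 1
  count <= limit
end

redis = Redis.new
allow?(redis, "SignIns::New:203.0.113.7", 50, 10.seconds) # => true on the first hit
```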
Again, I can't do the work right now, just putting down ideas :-)