Create a Dockerized REST API that exposes a single rate-limited endpoint, /limit. No single user should be able to hit the endpoint more than N times per second. You can identify a user by the API key in the X-API-TOKEN request header. If there is no API key, the request should be rejected immediately.
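For reference, here is a minimal sketch of how such a handler might look with net/http and the go-cache package mentioned in the notes below. The handler name, messages, and status codes are my illustrative choices, not necessarily what this repo's code does:

```go
package main

import (
	"fmt"
	"net/http"
	"time"

	cache "github.com/patrickmn/go-cache"
)

// limitPerSecond mirrors the deployed setting (2 instead of the original 10).
const limitPerSecond = 2

// counters keeps one request counter per API key; entries expire after one second.
var counters = cache.New(time.Second, 10*time.Second)

func limitHandler(w http.ResponseWriter, r *http.Request) {
	key := r.Header.Get("X-API-TOKEN")
	if key == "" {
		// No API key: reject immediately, as the task requires.
		http.Error(w, "missing X-API-TOKEN header", http.StatusUnauthorized)
		return
	}
	// Bump this key's counter for the current one-second window.
	n, err := counters.IncrementInt(key, 1)
	if err != nil {
		// Counter not present yet: first request in this window.
		// (Not fully race-proof under concurrency; fine for a sketch.)
		counters.Set(key, 1, cache.DefaultExpiration)
		n = 1
	}
	if n > limitPerSecond {
		http.Error(w, "rate limit exceeded", http.StatusTooManyRequests)
		return
	}
	fmt.Fprintln(w, "ok")
}

func main() {
	http.HandleFunc("/limit", limitHandler)
	http.ListenAndServe(":3001", nil)
}
```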
docker run -p 3001:3001 diegomadness/limit &
The original task requested a limit of 10 requests per second, but I've set it to 2 so you'll have an easier time hitting the limit if the server is deployed far away from you.
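Assuming the container is running locally on port 3001, a quick loop like this should trip the limiter (the token value is arbitrary):

for i in 1 2 3; do curl -H "X-API-TOKEN: demo-key" http://localhost:3001/limit; done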
I had already completed this exact task with a PHP + Nginx + Redis stack. For this version, I decided to spend less than 6 hours and keep the code as simple as possible. What I am trying to show with this solution:
- I can complete the task using Golang
- The deploy process is very simple
- I use a Dockerfile for app deployment
1. git clone https://github.com/diegomadness/limit
2. cd limit
3. docker build -t diegomadness/limit .
4. docker run -p 3001:3001 diegomadness/limit &
You can repeat step 4 as many times as you have free host ports, as shown below.
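For example, a second instance could map a different host port to the container's port 3001 (the host port choice is arbitrary):

docker run -p 3002:3001 diegomadness/limit &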
To stop this nonsense, run docker ps to find the running container's ID, then kill it by typing docker stop %container-id%.
Assuming this service is going to be part of a Kubernetes deployment, the Kubernetes load balancer can take care of spreading traffic across instances. To improve the app's performance, the fasthttp package could be used instead of net/http to provide the server functionality.
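As a rough illustration of that swap (assuming the valyala/fasthttp package; the routing and handler body are simplified, with the rate check elided):

```go
package main

import (
	"github.com/valyala/fasthttp"
)

// Same routing and header checks as the net/http version, but using
// fasthttp's single request context instead of ResponseWriter/Request.
func handler(ctx *fasthttp.RequestCtx) {
	if string(ctx.Path()) != "/limit" {
		ctx.SetStatusCode(fasthttp.StatusNotFound)
		return
	}
	if len(ctx.Request.Header.Peek("X-API-TOKEN")) == 0 {
		ctx.SetStatusCode(fasthttp.StatusUnauthorized)
		return
	}
	// ...per-key rate check would go here, as in the net/http sketch...
	ctx.SetBodyString("ok")
}

func main() {
	fasthttp.ListenAndServe(":3001", handler)
}
```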
I would also consider replacing go-cache with a standalone Redis or Memcached deployment to increase stability and performance.
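A sketch of what the per-key check might look like backed by Redis instead of go-cache (assuming the redis/go-redis client; the key prefix, function name, and window handling are my choices):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/redis/go-redis/v9"
)

var rdb = redis.NewClient(&redis.Options{Addr: "localhost:6379"})

// allow reports whether the given API key is still under `limit` requests
// in the current one-second window, using a Redis counter with a TTL.
func allow(ctx context.Context, key string, limit int64) (bool, error) {
	n, err := rdb.Incr(ctx, "rate:"+key).Result()
	if err != nil {
		return false, err
	}
	if n == 1 {
		// First request in the window: start the one-second expiry.
		rdb.Expire(ctx, "rate:"+key, time.Second)
	}
	return n <= limit, nil
}

func main() {
	ok, err := allow(context.Background(), "demo-key", 2)
	fmt.Println(ok, err)
}
```

The INCR-then-EXPIRE pattern is the usual fixed-window counter; a Lua script or a Redis transaction would make the two steps atomic.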
Any requests over the limit can easily be tracked by a service like Sentry. I could also have Prometheus connect to the application every now and then to get a snapshot of the cached request statistics.
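On the Prometheus side, a sketch of exposing a rejection counter with the official client_golang library (the metric name is made up for illustration; the repo does not ship this):

```go
package main

import (
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// rejectedTotal counts requests turned away by the limiter. A plain counter
// is used here; labelling it by API key would risk high metric cardinality.
var rejectedTotal = prometheus.NewCounter(prometheus.CounterOpts{
	Name: "limit_rejected_requests_total",
	Help: "Requests rejected because the per-key rate limit was exceeded.",
})

func main() {
	prometheus.MustRegister(rejectedTotal)
	// The /limit handler would call rejectedTotal.Inc() whenever it returns 429.
	http.Handle("/metrics", promhttp.Handler())
	http.ListenAndServe(":3001", nil)
}
```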