# The Redis proxy

## Requirements

The requirements for the assignment are listed below.

### Web Service

* A web server listens on the port configured in the config file `resources/config.properties`.
* The web server only takes query parameters, as this is a toy project:
  * for example, `?key={...}&value={...}` for POST (write)
  * and `?key={...}` for GET (read)
* A sketch of the request handling appears at the end of this README.

### Backend with Redis

* The proxy is configured to talk to Redis through the `config.properties` file mentioned above.

### Local Cache: TTLLRUCache (a TTL + LRU cache)

* The specified key is used to store the cached value.
* The global expiry time is specified in the global configuration.
* The capacity is specified in the global configuration as a number of keys.
* Cache keys are subject to LRU eviction and TTL expiry.
* The implementation is based on Twitter's storehaus TTL cache and LRU cache combined (a simplified sketch appears at the end of this README).

### Global Configuration

* Address and port of the backend Redis
* Expiry time
* Capacity
* Listening port
* Max concurrency

A sketch of loading these settings appears at the end of this README.

### Concurrency

* Clients should be able to connect concurrently.
* Internally, requests can be processed sequentially.
* The implementation here uses the Finagle library's `maxConcurrentRequests` (see the proxy sketch at the end of this README).

### Tests

* End-to-end system function tests.
* The system tests need a backing Redis server, as there is no suitable embedded Redis server implementation.
* The proxy should connect to a running Redis instance.

### How to run a basic local test

* Start a local Redis service: `redis-server /usr/local/etc/redis.conf`
* Start this redis-proxy service: `./sbt 'runMain com.github.bpin.Main'`
* Use a curl command:
  * GET: `curl "localhost:9090/?key=joe"`
  * POST: `curl -X POST -D - "localhost:9090/?key=name&value=brian"`
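
### Sketch: loading the global configuration

This README does not list the exact property names used in `resources/config.properties`, so the key names and defaults below are hypothetical. The sketch only illustrates one plausible way to load the settings listed under "Global Configuration" with `java.util.Properties`.

```scala
import java.util.Properties

object ConfigSketch {
  // Loads config.properties from the classpath (src/main/resources).
  def load(resource: String = "/config.properties"): Properties = {
    val props = new Properties()
    val in = getClass.getResourceAsStream(resource)
    require(in != null, s"$resource not found on the classpath")
    try props.load(in) finally in.close()
    props
  }

  def main(args: Array[String]): Unit = {
    val props = load()
    // Hypothetical keys mirroring the settings listed under "Global Configuration".
    val redisHost     = props.getProperty("redis.host", "localhost")
    val redisPort     = props.getProperty("redis.port", "6379").toInt
    val listenPort    = props.getProperty("proxy.port", "9090").toInt
    val cacheCapacity = props.getProperty("cache.capacity", "1000").toInt
    val cacheTtlMs    = props.getProperty("cache.ttl.millis", "60000").toLong
    val maxConcurrent = props.getProperty("proxy.maxConcurrentRequests", "10").toInt
    println(s"proxy :$listenPort -> redis $redisHost:$redisPort, " +
      s"capacity=$cacheCapacity, ttl=${cacheTtlMs}ms, maxConcurrentRequests=$maxConcurrent")
  }
}
```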
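
### Sketch: TTL + LRU cache behaviour

The proxy's cache is built from storehaus's TTL and LRU caches; the sketch below is not that implementation, just a minimal, single-threaded illustration of the behaviour described above: least-recently-used entries are evicted once the key capacity is exceeded, and entries older than the global TTL are treated as misses. The class and method names here are made up for the example.

```scala
import java.util.{LinkedHashMap => JLinkedHashMap, Map => JMap}

// Minimal TTL + LRU cache sketch (not thread-safe).
class TtlLruCacheSketch[K, V](capacity: Int, ttlMillis: Long) {

  private case class Entry(value: V, expiresAt: Long)

  // A LinkedHashMap in access order keeps LRU ordering for us;
  // removeEldestEntry drops the least recently used entry once
  // the configured capacity is exceeded.
  private val underlying = new JLinkedHashMap[K, Entry](16, 0.75f, true) {
    override def removeEldestEntry(eldest: JMap.Entry[K, Entry]): Boolean =
      size() > capacity
  }

  def put(key: K, value: V): Unit =
    underlying.put(key, Entry(value, System.currentTimeMillis() + ttlMillis))

  // Expiry is checked lazily on read: an expired entry is removed
  // and reported as a miss.
  def get(key: K): Option[V] =
    Option(underlying.get(key)) match {
      case Some(e) if e.expiresAt > System.currentTimeMillis() => Some(e.value)
      case Some(_) => underlying.remove(key); None
      case None    => None
    }
}
```

For example, `new TtlLruCacheSketch[String, String](capacity = 2, ttlMillis = 60000)` holds at most two keys and answers `get` with `None` once an entry is a minute old.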
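
### Sketch: the proxy service with Finagle

The sketch below shows how the GET/POST query-parameter handling, the Redis backend, and `maxConcurrentRequests` could fit together using finagle-http and finagle-redis. It assumes the `Buf`-based finagle-redis client API and Finagle's admission-control concurrency limit; the Redis address (`localhost:6379`), listen port (`:9090`), and limits are hard-coded here, whereas the real `Main` reads them from `config.properties`, and the local cache is omitted.

```scala
import com.twitter.finagle.{Http, Redis, Service}
import com.twitter.finagle.http.{Method, Request, Response, Status}
import com.twitter.io.Buf
import com.twitter.util.{Await, Future}

object ProxySketch {
  def main(args: Array[String]): Unit = {
    val redis = Redis.newRichClient("localhost:6379")

    val service = new Service[Request, Response] {
      private def reply(status: Status, body: String): Response = {
        val rep = Response()
        rep.status = status
        rep.contentString = body
        rep
      }

      def apply(req: Request): Future[Response] =
        (req.method, req.params.get("key"), req.params.get("value")) match {
          // GET ?key=... reads the value from Redis.
          case (Method.Get, Some(key), _) =>
            redis.get(Buf.Utf8(key)).map {
              case Some(Buf.Utf8(value)) => reply(Status.Ok, value)
              case _                     => reply(Status.NotFound, s"$key not found")
            }
          // POST ?key=...&value=... writes the value to Redis.
          case (Method.Post, Some(key), Some(value)) =>
            redis.set(Buf.Utf8(key), Buf.Utf8(value)).map(_ => reply(Status.Ok, s"$key=$value"))
          case _ =>
            Future.value(reply(Status.BadRequest, "expected ?key=... (and &value=... for POST)"))
        }
    }

    // maxConcurrentRequests caps how many requests are served at once;
    // additional requests queue up to maxWaiters before being rejected.
    val server = Http.server
      .withAdmissionControl.concurrencyLimit(maxConcurrentRequests = 10, maxWaiters = 100)
      .serve(":9090", service)

    Await.ready(server)
  }
}
```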