An asyncio cache library supporting multiple backends.
This library aims for simplicity over specialization. All caches expose the same minimum interface, which consists of the following functions:

- `add`: Only adds the key/value if the key does not exist.
- `get`: Retrieves the value identified by the key.
- `set`: Sets the key/value.
- `multi_get`: Retrieves multiple key/values.
- `multi_set`: Sets multiple key/values.
- `exists`: Returns True if the key exists, False otherwise.
- `increment`: Increments the value stored in the given key.
- `delete`: Deletes the key and returns the number of deleted items.
- `clear`: Clears the stored items.
- `raw`: Executes the specified command using the underlying client.
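To make the semantics of this interface concrete, here is a minimal stdlib-only sketch of the same contract (a toy stand-in, not aiocache's implementation; e.g. it assumes `add` rejects an existing key and `delete` returns the number of deleted items, as described above):

```python
import asyncio


class ToyCache:
    """Toy in-memory cache mirroring the minimum interface described above."""

    def __init__(self):
        self._store = {}

    async def add(self, key, value):
        if key in self._store:              # add never overwrites
            raise ValueError("Key %r already exists" % key)
        self._store[key] = value
        return True

    async def set(self, key, value):
        self._store[key] = value            # set always overwrites
        return True

    async def get(self, key):
        return self._store.get(key)

    async def multi_set(self, pairs):
        for key, value in pairs:
            self._store[key] = value
        return True

    async def multi_get(self, keys):
        return [self._store.get(k) for k in keys]

    async def exists(self, key):
        return key in self._store

    async def increment(self, key, delta=1):
        self._store[key] = self._store.get(key, 0) + delta
        return self._store[key]

    async def delete(self, key):
        # Returns the number of deleted items (0 or 1 here)
        return 1 if self._store.pop(key, None) is not None else 0

    async def clear(self):
        self._store.clear()
        return True


async def main():
    cache = ToyCache()
    await cache.set("key", "value")
    assert await cache.get("key") == "value"
    assert await cache.multi_get(["key", "missing"]) == ["value", None]
    assert await cache.delete("key") == 1
    assert await cache.exists("key") is False

asyncio.run(main())
```

The real backends implement the same calls against Redis or memcached instead of a local dict.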
Install it with `pip install aiocache`.
Using a cache is as simple as:

```python
>>> import asyncio
>>> loop = asyncio.get_event_loop()
>>> from aiocache import SimpleMemoryCache
>>> cache = SimpleMemoryCache()
>>> loop.run_until_complete(cache.set('key', 'value'))
True
>>> loop.run_until_complete(cache.get('key'))
'value'
```
You can also set up cache aliases, similar to Django settings:
```python
import asyncio

from aiocache import settings, caches, SimpleMemoryCache, RedisCache
from aiocache.serializers import DefaultSerializer, PickleSerializer

# You can use either classes or strings for referencing classes
settings.set_config({
    'default': {
        'cache': "aiocache.SimpleMemoryCache",
        'serializer': {
            'class': "aiocache.serializers.DefaultSerializer"
        }
    },
    'redis_alt': {
        'cache': "aiocache.RedisCache",
        'endpoint': "127.0.0.1",
        'port': 6379,
        'timeout': 1,
        'serializer': {
            'class': "aiocache.serializers.PickleSerializer"
        },
        'plugins': [
            {'class': "aiocache.plugins.HitMissRatioPlugin"},
            {'class': "aiocache.plugins.TimingPlugin"}
        ]
    }
})


async def default_cache():
    cache = caches['default']  # This always returns the same instance
    await cache.set("key", "value")

    assert await cache.get("key") == "value"
    assert isinstance(cache, SimpleMemoryCache)
    assert isinstance(cache.serializer, DefaultSerializer)


async def alt_cache():
    cache = caches['redis_alt']  # This always returns the same instance
    await cache.set("key", "value")

    assert await cache.get("key") == "value"
    assert isinstance(cache, RedisCache)
    assert isinstance(cache.serializer, PickleSerializer)
    assert len(cache.plugins) == 2
    assert cache.endpoint == "127.0.0.1"
    assert cache.timeout == 1
    assert cache.port == 6379


def test_alias():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(default_cache())
    loop.run_until_complete(alt_cache())

    loop.run_until_complete(RedisCache().delete("key"))


if __name__ == "__main__":
    test_alias()
```
In the examples folder you can check different use cases:
- Integrations with frameworks like Sanic, Aiohttp and Tornado
- Storing a python object in Redis
- Creating a custom serializer class that compresses data
- TimingPlugin and HitMissRatioPlugin demos
- Using marshmallow as a serializer
- Using the cached decorator
- Using the multi_cached decorator
- Configuring cache class default args
- Simple LRU plugin for memory
Aiocache provides 3 main entities:
- backends: Allow you to specify which backend to use for your cache. Currently supported: SimpleMemoryCache, RedisCache using aioredis, and MemCache using aiomcache.
- serializers: Serialize and deserialize the data between your code and the backends. This allows you to store any Python object in your cache. Currently supported: DefaultSerializer, PickleSerializer, JsonSerializer.
- plugins: Implement a hooks system that allows extra behavior to be executed before and after each command.
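As an example of the serializer entity, here is a minimal sketch of a custom serializer that compresses data (one of the use cases listed above). It assumes the serializer contract is a `dumps`/`loads` pair, as in `aiocache.serializers`; `CompressionSerializer` is an illustrative name, not part of the library:

```python
import pickle
import zlib


class CompressionSerializer:
    """Sketch of a custom serializer: pickle the value, then zlib-compress it."""

    def dumps(self, value):
        # Serialize to bytes and compress before handing off to the backend
        return zlib.compress(pickle.dumps(value))

    def loads(self, data):
        # Backends return None for missing keys; pass that through unchanged
        if data is None:
            return None
        return pickle.loads(zlib.decompress(data))


serializer = CompressionSerializer()
payload = serializer.dumps({"user": "alice", "scores": list(range(100))})
assert serializer.loads(payload) == {"user": "alice", "scores": list(range(100))}
```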
If you are missing an implementation of backend, serializer or plugin you think it could be interesting for the package, do not hesitate to open a new issue.
These 3 entities combine during cache operations to apply the desired command (backend), data transformation (serializer) and pre/post hooks (plugins). To get a better picture of what happens, you can check how the `set` function works in aiocache:
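The flow described above can be sketched as follows (all names here are hypothetical; this is a toy illustration of the pattern, not aiocache's actual `set` implementation):

```python
import asyncio
import json
import time


class TimingPluginSketch:
    """Toy plugin: pre/post hooks measuring how long set() takes."""

    async def pre_set(self, client, key, value):
        client._t0 = time.monotonic()

    async def post_set(self, client, key, value):
        client.last_set_ms = (time.monotonic() - client._t0) * 1000


class ToyClient:
    def __init__(self, plugins=()):
        self._backend = {}              # backend: the actual storage
        self._plugins = list(plugins)   # plugins: pre/post hooks

    def serialize(self, value):
        return json.dumps(value)        # serializer: data transformation

    async def set(self, key, value):
        for plugin in self._plugins:            # 1. pre hooks
            await plugin.pre_set(self, key, value)
        self._backend[key] = self.serialize(value)  # 2. serialize + backend command
        for plugin in self._plugins:            # 3. post hooks
            await plugin.post_set(self, key, value)
        return True


async def main():
    client = ToyClient(plugins=[TimingPluginSketch()])
    await client.set("key", {"a": 1})
    print(client._backend["key"])       # '{"a": 1}'
    print(client.last_set_ms >= 0)      # True

asyncio.run(main())
```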