A Redis module for serving tensors and executing deep learning models. Expect changes in the API and internals.
If you want to run the examples, make sure you have git-lfs installed when cloning the repository.
To quickly try out RedisAI, launch an instance using Docker:
docker run -p 6379:6379 -it --rm redisai/redisai
On the client, load the model:
redis-cli -x AI.MODELSET foo TF CPU INPUTS a b OUTPUTS c < examples/models/graph.pb
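Note that the -x flag makes redis-cli read the last argument of the command, in this case the serialized graph blob, from standard input.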
Then create the input tensors, run the computation graph, and get the output tensor (see load_model.sh). Note the signatures:
AI.TENSORSET tensor_key data_type dim1..dimN [BLOB data | VALUES val1..valN]
AI.MODELRUN graph_key INPUTS input_key1 ... OUTPUTS output_key1 ...
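BLOB passes the tensor contents as a single binary string, whereas VALUES passes each element as a separate argument; the session below uses VALUES for readability.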
redis-cli
> AI.TENSORSET bar FLOAT 2 VALUES 2 3
> AI.TENSORSET baz FLOAT 2 VALUES 2 3
> AI.MODELRUN foo INPUTS bar baz OUTPUTS jez
> AI.TENSORGET jez VALUES
1) FLOAT
2) 1) (integer) 2
3) 1) "4"
2) "9"
Running the following script will download and build the libraries for the backends (TensorFlow, PyTorch, ONNXRuntime) for your platform. Note that this requires CUDA to be installed.
bash get_deps.sh
Alternatively, run the following to fetch the CPU-only backends.
bash get_deps.sh cpu
Once the dependencies are downloaded, build the module itself. Note that CMake 3.0 or higher is required.
mkdir build
cd build
cmake -DDEPS_PATH=../deps/install ..
make
cd ..
You will need a redis-server version 4.0.9 or greater. This should be available in most recent distributions:
redis-server --version
Redis server v=4.0.9 sha=00000000:0 malloc=libc bits=64 build=c49f4faf7c3c647a
To start redis with the RedisAI module loaded:
redis-server --loadmodule build/redisai.so
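Alternatively, add a `loadmodule build/redisai.so` directive (with the path to your built module) to redis.conf to load the module at every server startup.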
Some languages have client libraries that provide support for RedisAI's commands:
Project | Language | License | Author | URL |
---|---|---|---|---|
JRedisAI | Java | BSD-3 | RedisLabs | Github |
redisai-py | Python | BSD-3 | RedisLabs | Github |
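These clients wrap the RedisAI commands in idiomatic APIs. Even without a dedicated client, any Redis client that can issue arbitrary commands will work; below is a minimal sketch using redis-py that mirrors the redis-cli session above (it assumes a local instance with the foo model already loaded).

```python
# Sketch only: driving RedisAI from Python via redis-py's generic command
# interface. Assumes a local Redis with the RedisAI module loaded and the
# "foo" model already set, as in the quick start.
import redis

r = redis.Redis(host="localhost", port=6379)

r.execute_command("AI.TENSORSET", "bar", "FLOAT", 2, "VALUES", 2, 3)
r.execute_command("AI.TENSORSET", "baz", "FLOAT", 2, "VALUES", 2, 3)
r.execute_command("AI.MODELRUN", "foo", "INPUTS", "bar", "baz", "OUTPUTS", "jez")

# Returns the dtype, shape, and values of the output tensor.
print(r.execute_command("AI.TENSORGET", "jez", "VALUES"))
```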
RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match that of another. To make sure your model will work with a given RedisAI version, check the backend documentation for features that are incompatible between the version of your backend and the version RedisAI is built with.
RedisAI | PyTorch | TensorFlow | ONNXRuntime |
---|---|---|---|
0.1.0 | 1.0.1 | 1.12.0 | None |
0.2.1 | 1.0.1 | 1.12.0 | None |
0.3.1 | 1.1 | 1.12.0 | 0.4.0 |
Read the docs at redisai.io. Check out our showcase repo for many examples written using different client libraries.
Redis Source Available License Agreement - see LICENSE
Copyright 2019, Tensorwerk Inc & Redis Labs Ltd