Differentiable computations of the signature and logsignature transforms, on both CPU and GPU.
The signature transform is roughly analogous to the Fourier transform, in that it operates on a stream of data (often a time series). Whilst the Fourier transform extracts information about frequency, the signature transform extracts information about order and area. Furthermore (and unlike the Fourier transform), order and area represent all possible nonlinear effects: the signature transform is a universal nonlinearity, meaning that every continuous function of the input stream may be approximated arbitrarily well by a linear function of its signature. If you're doing machine learning then you probably understand why this is such a desirable property!
Besides this, the signature transform has many other nice properties -- robustness to missing or irregularly sampled data; optional translation invariance; optional sampling invariance. Furthermore it can be used to encode certain physical quantities, and may be used for data compression.
Check out this for a primer on the use of the signature transform in machine learning, just as a feature transformation, and this for a more in-depth look at integrating the signature transform into neural networks.
pip install signatory==<SIGNATORY_VERSION>.<TORCH_VERSION> --no-cache-dir --force-reinstall

where <SIGNATORY_VERSION> is the version of Signatory you would like to download (the most recent version is 1.2.1) and <TORCH_VERSION> is the version of PyTorch you are using.
Available for Python 2.7, 3.5, 3.6, 3.7, and 3.8, on Linux, Mac, and Windows. Requires PyTorch 1.2.0, 1.3.0, 1.3.1, 1.4.0, or 1.5.0.
After installation, just import signatory inside Python. Take care not to run pip install signatory, as this will likely download the wrong version.
For example, if you are using PyTorch 1.3.0 and want Signatory 1.1.4, then you should run:
pip install signatory==1.1.4.1.3.0 --no-cache-dir --force-reinstall
Yes, this looks a bit odd. This is needed to work around limitations of PyTorch and pip.
The --no-cache-dir --force-reinstall flags are needed because pip doesn't expect to have to care about versions quite as much as this, so it will sometimes erroneously use inappropriate caches if not told otherwise.
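In other words, the pinned specifier is just the two version strings concatenated with a dot. A small sketch of the scheme (the version numbers below are purely illustrative; substitute your own, e.g. the value of torch.__version__):

```python
# Assemble the pinned pip specifier for Signatory.
# Version numbers here are illustrative, not prescriptive.
signatory_version = "1.2.1"   # the Signatory release you want
torch_version = "1.4.0"       # e.g. the value of torch.__version__
package = "signatory==" + signatory_version + "." + torch_version
print(package)  # signatory==1.2.1.1.4.0
```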
Installation from source is also possible; please consult the documentation. This also includes information on how to run the tests and benchmarks.
If you have any problems with installation then check the FAQ. If that doesn't help then feel free to open an issue.
The documentation is available here.
Usage is straightforward. As a simple example,
import signatory
import torch
batch, stream, channels = 1, 10, 2
depth = 4
path = torch.rand(batch, stream, channels)
signature = signatory.signature(path, depth)
# signature is a PyTorch tensor of shape
# (batch, channels + channels**2 + ... + channels**depth)
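The number of output channels grows geometrically with depth: a path with c channels has a depth-d signature with c + c^2 + ... + c^d terms. As a quick stdlib-only sketch of that count (the helper below is my own illustration, not Signatory's API):

```python
def count_signature_channels(channels, depth):
    # Number of terms in the depth-`depth` signature of a
    # `channels`-dimensional path: channels + channels**2 + ... + channels**depth.
    return sum(channels ** k for k in range(1, depth + 1))

# Matches the example above: a 2-channel path at depth 4.
print(count_signature_channels(2, 4))  # 30
```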
For further examples, see the documentation.
If you found this library useful in your research, please consider citing the paper.
@article{signatory,
title={{Signatory: differentiable computations of the signature and logsignature transforms, on both CPU and GPU}},
author={Kidger, Patrick and Lyons, Terry},
journal={arXiv:2001.00706},
url={https://github.com/patrick-kidger/signatory},
year={2020}
}