This library contains a TensorFlow implementation of the Latent Fact Model and the Latent Information Model with Gaussian and von Mises-Fisher latent priors, using the re-parametrisation trick to learn the distributional parameters. The vMF re-parametrisation trick is as presented in [1](http://arxiv.org/abs/1804.00891); see also the authors' blog post (https://nicola-decao.github.io/s-vae). The Gaussian re-parametrisation trick uses TensorFlow Probability.
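As a minimal illustration of the idea (a sketch in plain Python, not this library's API), the Gaussian re-parametrisation trick rewrites a sample as z = mu + sigma * eps with eps ~ N(0, 1), so the sample becomes a deterministic, differentiable function of the distributional parameters:

```python
import math
import random

def reparameterised_gaussian_sample(mu, log_var, eps=None):
    """Draw z = mu + sigma * eps with eps ~ N(0, 1).

    All the randomness lives in eps, so z is a deterministic function
    of (mu, log_var) -- the key property that lets gradients flow
    through the distributional parameters during training.
    """
    if eps is None:
        eps = random.gauss(0.0, 1.0)
    sigma = math.exp(0.5 * log_var)  # log-variance parametrisation keeps sigma positive
    return mu + sigma * eps

# With eps held fixed, the output is exactly mu + exp(log_var / 2) * eps:
z = reparameterised_gaussian_sample(mu=1.0, log_var=0.0, eps=0.5)
print(z)  # 1.5
```

In the library itself this is handled by TensorFlow Probability's reparameterised Gaussian sampling; the sketch above only shows the mechanics.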
- python>=3.6
- tf-nightly: https://tensorflow.org
- tfp-nightly: https://www.tensorflow.org/probability/
- scipy: https://scipy.org
To install, run
$ python setup.py install
- Alexander Cowen-Rivers (GitHub)
For:
- the models, see the Latent Fact Model and the Latent Information Model
- the paper, see ACR
Train a variational knowledge graph model on the nations dataset with a normal prior, using the DistMult scoring function:
python main_LIM.py --no_batches 10 --epsilon 1e-07 --embedding_size 50 --dataset nations --alt_prior False --lr 0.001 --score_func DistMult --negsamples 5 --projection False --distribution normal --file_name /User --s_o False
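For reference, the DistMult scoring function selected by `--score_func DistMult` scores a triple (subject, predicate, object) as the trilinear product of their embeddings, i.e. the sum of the element-wise product of the three vectors. A toy sketch with made-up example vectors (not the library's code):

```python
def distmult_score(subj_emb, rel_emb, obj_emb):
    """DistMult: sum_i subj[i] * rel[i] * obj[i] over the embedding dimension."""
    return sum(s * r * o for s, r, o in zip(subj_emb, rel_emb, obj_emb))

# Example with 2-dimensional embeddings:
score = distmult_score([1.0, 2.0], [0.5, -1.0], [2.0, 1.0])
print(score)  # 1.0*0.5*2.0 + 2.0*(-1.0)*1.0 = -1.0
```

Note that DistMult is symmetric in subject and object, which is one reason alternative scoring functions are offered as a flag.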
- Clone or download this repository.
- Prepare your data, or use any of the six included KG datasets.
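Knowledge graph datasets are commonly stored as one tab-separated (subject, predicate, object) triple per line; assuming that layout (check the bundled datasets for the exact format this library expects), a minimal loader might look like:

```python
def load_triples(path):
    """Parse tab-separated (subject, predicate, object) lines into triples.

    The tab-separated layout is an assumption -- inspect the included
    datasets to confirm the format before preparing your own data.
    """
    triples = []
    with open(path) as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) == 3:  # skip blank or malformed lines
                triples.append(tuple(parts))
    return triples
```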
Please cite [1] in your work when using this library in your experiments.
For questions and comments, feel free to contact [Alexander Cowen-Rivers](mailto:mc_rivers@icloud.com).
MIT
[1] Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T.,
and Tomczak, J. M. (2018). Hyperspherical Variational
Auto-Encoders. arXiv preprint arXiv:1804.00891.
BibTeX format:
@article{s-vae18,
title={Hyperspherical Variational Auto-Encoders},
author={Davidson, Tim R. and
Falorsi, Luca and
De Cao, Nicola and
Kipf, Thomas and
Tomczak, Jakub M.},
journal={arXiv preprint arXiv:1804.00891},
year={2018}
}