
Bayesian Tensor Toolbox (BayTT)


(The repo is still under construction; some links and statistics may be wrong. We will release the full code soon.)


BayTT is an open-source library collecting state-of-the-art models and baselines for Bayesian tensor decomposition.

We provide a clean code base for decomposing sparse tensors in a probabilistic way. It currently covers three mainstream tasks: Sparse Tensor Decomposition, Streaming Tensor Decomposition, and Temporal Tensor Decomposition. We will add more topics, such as Functional Tensor Decomposition, in the future. A minimal sketch of this probabilistic view is given below.
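To make the probabilistic view concrete, here is a minimal, self-contained sketch in plain NumPy (this is only an illustration, not the BayTT API; all names, sizes, and the synthetic data are made up). It performs MAP estimation of a rank-R CP model from sparsely observed entries, with Gaussian priors on the factor rows and Gaussian observation noise; each row update is the mean of the corresponding conjugate Gaussian conditional posterior.

```python
# Illustrative sketch (not the BayTT API): MAP estimation of a rank-R CP model
# on a sparsely observed tensor, with Gaussian priors and Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R, n_obs = 20, 15, 10, 3, 500
tau, lam = 100.0, 1.0                      # noise precision, prior precision

# Synthetic sparse tensor in COO form: integer indices plus observed values.
idx = np.stack([rng.integers(0, d, n_obs) for d in (I, J, K)], axis=1)
U0, V0, W0 = (rng.normal(size=(d, R)) for d in (I, J, K))
y = np.einsum('nr,nr,nr->n', U0[idx[:, 0]], V0[idx[:, 1]], W0[idx[:, 2]])
y += rng.normal(scale=0.1, size=n_obs)

factors = [rng.normal(scale=0.1, size=(d, R)) for d in (I, J, K)]

for sweep in range(20):                    # alternating conditional MAP updates
    for m in range(3):                     # update one mode at a time
        others = [o for o in range(3) if o != m]
        # "Design vector" per observation: elementwise product of the other modes' rows.
        C = factors[others[0]][idx[:, others[0]]] * factors[others[1]][idx[:, others[1]]]
        for i in range(factors[m].shape[0]):
            mask = idx[:, m] == i          # observations touching row i of mode m
            if not mask.any():
                continue
            Ci, yi = C[mask], y[mask]
            A = tau * Ci.T @ Ci + lam * np.eye(R)
            b = tau * Ci.T @ yi
            factors[m][i] = np.linalg.solve(A, b)   # conjugate Gaussian posterior mean

pred = np.einsum('nr,nr,nr->n',
                 factors[0][idx[:, 0]], factors[1][idx[:, 1]], factors[2][idx[:, 2]])
print('train RMSE:', np.sqrt(np.mean((y - pred) ** 2)))
```

The Bayesian models in this repo go further than this MAP point estimate: they keep full posterior distributions over the factors, via variational inference, expectation propagation, or streaming updates.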

For each task, we provide a leaderboard evaluated on several classical datasets. The datasets are also provided in this repo.

Leaderboard

Note: We will keep updating this leaderboard. If you have proposed an advanced model, you can send us your paper/code link or open a pull request, and we will add it to this repo and update the leaderboard as soon as possible.

Models compared in this leaderboard. ☑ means that the model's code is already included in this repo.

Sparse Tensor Decomposition

| Rank | Movielens 10K | Movielens 1M | ACC | DBLP |
|------|---------------|--------------|-----|------|
| 🥇 1st | NEST | NEST | NEST | BASS-Tucker |
| 🥈 2nd | POND | POND | POND | POND |
| 🥉 3rd | SparseHGP | SparseHGP | SparseHGP | SparseHGP |
  • NEST - “Nonparametric Decomposition of Sparse Tensors”, [ICML 2021] [Code].
  • POND - “Probabilistic Neural-Kernel Tensor Decomposition”, [ICDM 2020] [Code].
  • SparseHGP - “Nonparametric Sparse Tensor Factorization with Hierarchical Gamma Processes”, [ICML 2022] [Code].

Streaming Tensor Decomposition

| Rank | Movie-lens | DBLP | ACC | demo |
|------|-----------|------|-----|------|
| 🥇 1st | SFTL | SFTL | SFTL | SFTL |
| 🥈 2nd | SBDT | SBDT | SBDT | SBDT |
| 🥉 3rd | BASS-Tucker | BASS-Tucker | BASS-Tucker | BASS-Tucker |
  • BASS-Tucker - Shikai Fang, Akil Narayan, Robert Kirby, and Shandian Zhe, “Bayesian Continuous-Time Tucker Decomposition”, the 39th International Conference on Machine Learning [ICML 2022] [Code].
  • SBDT - Shikai Fang, Zheng Wang, Zhimeng Pan, Ji Liu, and Shandian Zhe, “Streaming Bayesian Deep Tensor Factorization”, [ICML 2021] [Code].
  • SFTL - “Streaming Factor Trajectory Learning for Temporal Tensor Decomposition”, [NeurIPS 2023] [Code].

Temporal Tensor Decomposition

| Rank | Movie-lens | DBLP | ACC | demo |
|------|-----------|------|-----|------|
| 🥇 1st | SFTL | SFTL | DEMOTE | SFTL |
| 🥈 2nd | NON-FAT | DEMOTE | NON-FAT | DEMOTE |
| 🥉 3rd | BCTT | BCTT | NON-FAT | NON-FAT |
  • DEMOTE - Zheng Wang, Shikai Fang, Shibo Li, and Shandian Zhe, “Dynamic Tensor Decomposition via Neural Diffusion-Reaction Processes”, [NeurIPS 2023] [Code].
  • NON-FAT - Zheng Wang and Shandian Zhe, “Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition”, [ICML 2022] [Code].
  • BCTT - Shikai Fang, Akil Narayan, Robert M. Kirby, and Shandian Zhe, “Bayesian Continuous-Time Tucker Decomposition”, [ICML 2022] [Code].

List of Bayesian Tensor Models in this Repo

Bayesian Sparse Tensor Decomposition

| Name | Description | Resources |
|------|-------------|-----------|
| SparseHGP | Sparse tensor factorization with hierarchical Gamma processes | demo, paper, origin code |
| NEST | Nonlinear decomposition based on Dirichlet processes and Gaussian processes | demo, paper, origin code |
| POND | Nonlinear decomposition based on deep-kernel Gaussian processes | demo, paper, origin code |
| GPTF | Nonlinear decomposition based on sparse Gaussian processes | demo, paper, origin code |
| SVI-CP | Bayesian CP decomposition with stochastic variational inference (SVI) | demo, paper, origin code |
| SVI-Tucker | Bayesian Tucker decomposition with stochastic variational inference (SVI) | demo, paper, origin code |
| CEP-CP | Bayesian CP decomposition with conditional expectation propagation (CEP) | demo, paper, origin code |
| CEP-Tucker | Bayesian Tucker decomposition with conditional expectation propagation (CEP) | demo, paper, origin code |
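As a rough illustration of the stochastic variational inference (SVI) idea behind SVI-CP and SVI-Tucker (this is a hedged sketch, not the repo's implementation; every name, size, and hyperparameter below is illustrative), the snippet places a fully factorized Gaussian posterior over the CP factors and maximizes a reparameterized minibatch ELBO with PyTorch.

```python
# Hedged SVI sketch (not the repo's SVI-CP code): a fully factorized Gaussian
# posterior over rank-R CP factors, trained by maximizing a reparameterized ELBO
# on random minibatches of observed entries.
import torch

torch.manual_seed(0)
I, J, K, R, n_obs = 20, 15, 10, 3, 500
idx = torch.stack([torch.randint(0, d, (n_obs,)) for d in (I, J, K)], dim=1)
true = [torch.randn(d, R) for d in (I, J, K)]
y = (true[0][idx[:, 0]] * true[1][idx[:, 1]] * true[2][idx[:, 2]]).sum(-1)
y = y + 0.1 * torch.randn(n_obs)                      # synthetic observations

# Variational parameters: a mean and a log-std for every factor entry.
mu = [(0.1 * torch.randn(d, R)).requires_grad_() for d in (I, J, K)]
log_std = [torch.full((d, R), -2.0, requires_grad=True) for d in (I, J, K)]
opt = torch.optim.Adam(mu + log_std, lr=1e-2)
noise_var, batch = 0.1 ** 2, 100

for step in range(3000):
    b = torch.randint(0, n_obs, (batch,))             # random minibatch of entries
    # Reparameterized samples of the three factor matrices.
    samples = [m + torch.exp(s) * torch.randn_like(m) for m, s in zip(mu, log_std)]
    pred = (samples[0][idx[b, 0]] * samples[1][idx[b, 1]] * samples[2][idx[b, 2]]).sum(-1)
    # Minibatch estimate of the expected log-likelihood, rescaled to the full data.
    ell = -0.5 / noise_var * ((y[b] - pred) ** 2).mean() * n_obs
    # KL between N(mu, std^2) and the standard normal prior, summed over all entries.
    kl = sum((0.5 * (m ** 2 + torch.exp(2 * s) - 1) - s).sum() for m, s in zip(mu, log_std))
    loss = kl - ell                                   # negative ELBO
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    pred = (mu[0][idx[:, 0]] * mu[1][idx[:, 1]] * mu[2][idx[:, 2]]).sum(-1)
    print('posterior-mean train RMSE:', torch.sqrt(((y - pred) ** 2).mean()).item())
```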

Bayesian Streaming Tensor Decomposition

| Name | Description | Resources |
|------|-------------|-----------|
| SNBDT | Streaming nonlinear decomposition with random Fourier features | demo, paper, origin code |
| SBDT | Streaming deep decomposition with sparse BNNs | demo, paper, origin code |
| BASS-Tucker | Streaming Tucker decomposition with a sparse Tucker core | demo, paper, origin code |
| POST | Streaming CP decomposition with SVB updates | demo, paper, origin code |
| ADF-CP | Streaming CP decomposition with ADF updates | demo, paper, origin code |
| ADF-Tucker | Streaming Tucker decomposition with ADF updates | demo, paper, origin code |
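To give a flavor of how streaming updates such as those in POST or ADF-CP work, here is a simplified sketch (not the actual algorithms): each factor row keeps a Gaussian posterior, and every incoming entry is absorbed with a closed-form conjugate update while the other modes are held at their current posterior means. The real methods also propagate the uncertainty of those other modes, which this toy version ignores; all names and sizes are illustrative.

```python
# Simplified streaming sketch in the spirit of POST / ADF-CP (not the repo's code).
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R, tau = 20, 15, 10, 3, 100.0        # mode sizes, CP rank, noise precision

# Row-wise Gaussian posteriors in natural parameters, initialized to the N(0, I) prior.
prec = {m: np.tile(np.eye(R), (d, 1, 1)) for m, d in zip('UVW', (I, J, K))}
nat_mean = {m: np.zeros((d, R)) for m, d in zip('UVW', (I, J, K))}

def post_mean(m, i):
    """Posterior mean of row i of factor matrix m."""
    return np.linalg.solve(prec[m][i], nat_mean[m][i])

def streaming_update(i, j, k, y):
    """Absorb a single observed entry (i, j, k, y) into the row posteriors."""
    rows = {'U': i, 'V': j, 'W': k}
    for m in 'UVW':
        o1, o2 = [o for o in 'UVW' if o != m]
        # Conditional design vector: elementwise product of the other modes' means.
        c = post_mean(o1, rows[o1]) * post_mean(o2, rows[o2])
        prec[m][rows[m]] += tau * np.outer(c, c)      # precision update
        nat_mean[m][rows[m]] += tau * y * c           # natural-mean update

# Feed a synthetic stream of entries one at a time (placeholder observations).
for _ in range(2000):
    i, j, k = rng.integers(0, I), rng.integers(0, J), rng.integers(0, K)
    streaming_update(i, j, k, y=rng.normal())
```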

Bayesian Temporal Tensor Decomposition

| Name | Description | Resources |
|------|-------------|-----------|
| SFTL | Streaming temporal CP/Tucker with time-varying latent factors | demo, paper, origin code |
| DEMOTE | Temporal tensor as diffusion-reaction processes on a graph | demo, paper, origin code |
| BCTT | Temporal Tucker decomposition with a time-varying Tucker core | demo, paper, origin code |
| NON-FAT | GP priors + Fourier transform | demo, paper, origin code |
| THIS-ODE | Temporal tensor decomposition with a neural ODE | demo, paper, origin code |
| CT-GPTF | Continuous-time extension of GPTF (GP tensor factorization) | demo, paper, origin code |
| CT-CP | Continuous-time CP decomposition | demo, paper, origin code |
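As a toy illustration of what "temporal" means here (not any particular model in the table), the sketch below gives each factor row a simple linear trajectory in time and fits it to timestamped entries of a two-mode tensor; SFTL, NON-FAT, and BCTT replace these linear trajectories with much richer GP trajectories, diffusion-reaction dynamics, or a time-varying Tucker core. All names and data below are illustrative.

```python
# Toy temporal sketch: each factor row follows a linear trajectory
# u_i(t) = a_i + b_i * t, fitted to timestamped observed entries.
import torch

torch.manual_seed(0)
I, J, R, n_obs = 30, 20, 3, 1000
i_idx = torch.randint(0, I, (n_obs,))
j_idx = torch.randint(0, J, (n_obs,))
t = torch.rand(n_obs)                             # observation timestamps in [0, 1]
y = torch.randn(n_obs)                            # placeholder observed values

# Trajectory parameters: an intercept and a slope per factor row.
aU, bU = torch.randn(I, R, requires_grad=True), torch.randn(I, R, requires_grad=True)
aV, bV = torch.randn(J, R, requires_grad=True), torch.randn(J, R, requires_grad=True)
opt = torch.optim.Adam([aU, bU, aV, bV], lr=1e-2)

for step in range(1000):
    u = aU[i_idx] + bU[i_idx] * t[:, None]        # u_i(t), shape (n_obs, R)
    v = aV[j_idx] + bV[j_idx] * t[:, None]        # v_j(t)
    pred = (u * v).sum(-1)
    loss = ((y - pred) ** 2).mean()               # a real model adds priors/regularizers
    opt.zero_grad(); loss.backward(); opt.step()
```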

Usage

  1. Install Python 3.8. For convenience, execute the following command:
pip install -r requirements.txt
  2. Prepare the data. You can obtain the well pre-processed datasets from [Google Drive], [Tsinghua Cloud], or [Baidu Drive], then place the downloaded data under the folder ./dataset. Here is a summary of supported datasets.
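Assuming the datasets are stored in a COO-style layout (integer mode indices plus observed values), loading one might look like the sketch below; the file name and array keys are hypothetical and may differ from the released files.

```python
# Hypothetical loading sketch; './dataset/movielens_10k.npz' and the keys
# 'indices'/'values' are assumptions, not the guaranteed layout of the released data.
import numpy as np

data = np.load('./dataset/movielens_10k.npz')
indices = data['indices']   # assumed shape (n_obs, n_modes): integer index per mode
values = data['values']     # assumed shape (n_obs,): observed entry values
print(indices.shape, values.shape)
```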

Citation

If you find this repo useful, please cite our paper.

@inproceedings{fang2022bayesian,
  title={Bayesian Continuous-Time Tucker Decomposition},
  author={Fang, Shikai and Narayan, Akil and Kirby, Robert and Zhe, Shandian},
  booktitle={International Conference on Machine Learning},
  pages={6235--6245},
  year={2022},
  organization={PMLR}
}