lin-shuyu
A PhD student working to decipher the secret recipe of artificial intelligence.
University of Oxford, UK
Pinned Repositories
aimsMiniProject1
AttSets
AttSets in TensorFlow (IJCV 2019)
botorch
Bayesian optimization in PyTorch
floorplan
This repository contains a collection of floor plans on which we will carry out experiments. Each floor plan includes its scale, its heading relative to North, and the WiFi router positions.
GitExampleXC9
GPR-for-field-map
First AIMS mini-project
Hello-world
Just another repository, for getting-started practice.
ladder-latent-data-distribution-modelling
In this paper, we show that the performance of a learnt generative model is closely related to the model's ability to accurately represent the inferred latent data distribution, i.e. its topology and structural properties. We propose LaDDer to achieve accurate modelling of the latent data distribution in a variational autoencoder framework and to facilitate better representation learning. The central idea of LaDDer is a meta-embedding concept, which uses multiple VAE models to learn an embedding of the embeddings, forming a ladder of encodings. We use a non-parametric mixture as the hyper-prior for the innermost VAE and learn all the parameters in a unified variational framework. Through extensive experiments, we show that our LaDDer model accurately estimates complex latent distributions and improves representation quality.
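The meta-embedding idea amounts to training one VAE on the latent codes produced by another. Below is a minimal sketch in PyTorch of such a two-level ladder of encodings; the class names, layer sizes and the simple Gaussian hyper-prior are illustrative assumptions, not the repository's actual code (which uses a non-parametric mixture hyper-prior and a unified variational objective).

import torch
import torch.nn as nn

class GaussianVAE(nn.Module):
    """A small VAE block: encodes x to (mu, logvar) and decodes a latent sample."""
    def __init__(self, x_dim, z_dim, h_dim=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        return mu, logvar

    def reparameterize(self, mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.dec(z), mu, logvar

class LadderOfVAEs(nn.Module):
    """Two stacked VAEs: the outer one embeds the data, the inner one embeds
    the outer latent codes, i.e. it learns an embedding of the embeddings."""
    def __init__(self, x_dim=784, z1_dim=32, z2_dim=8):
        super().__init__()
        self.outer = GaussianVAE(x_dim, z1_dim)
        self.inner = GaussianVAE(z1_dim, z2_dim)

    def forward(self, x):
        x_recon, mu1, logvar1 = self.outer(x)           # reconstruct the data
        z1 = self.outer.reparameterize(mu1, logvar1)    # outer latent codes
        z1_recon, mu2, logvar2 = self.inner(z1)         # model the latent distribution itself
        return x_recon, z1_recon, (mu1, logvar1, mu2, logvar2)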
models
Models and examples built with TensorFlow
VAE-LSTM-for-anomaly-detection
We propose a VAE-LSTM model as an unsupervised learning approach for anomaly detection in time series.
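The approach pairs a window-level VAE with an LSTM over the resulting window embeddings. Below is an illustrative sketch in PyTorch (the repository itself is written in TensorFlow); the names, layer sizes and the prediction-error anomaly score are assumptions made for illustration, not the repository's actual code.

import torch
import torch.nn as nn

class WindowVAE(nn.Module):
    """Compresses a short window of the time series into a latent embedding."""
    def __init__(self, window_len=24, z_dim=8, h_dim=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(window_len, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, window_len))

    def embed(self, w):
        mu, logvar = self.enc(w).chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

class VAELSTMDetector(nn.Module):
    """An LSTM models the sequence of window embeddings; the anomaly score is
    the error of each window reconstructed from the preceding embeddings."""
    def __init__(self, window_len=24, z_dim=8):
        super().__init__()
        self.vae = WindowVAE(window_len, z_dim)
        self.lstm = nn.LSTM(z_dim, z_dim, batch_first=True)

    def anomaly_scores(self, windows):
        # windows: (batch, n_windows, window_len)
        z = self.vae.embed(windows)                       # embed each window
        z_pred, _ = self.lstm(z[:, :-1])                  # predict the next embeddings
        recon = self.vae.dec(z_pred)                      # decode predicted windows
        return ((recon - windows[:, 1:]) ** 2).mean(-1)   # per-window error as score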
lin-shuyu's Repositories
lin-shuyu/VAE-LSTM-for-anomaly-detection
We propose a VAE-LSTM model as an unsupervised learning approach for anomaly detection in time series.
lin-shuyu/ladder-latent-data-distribution-modelling
In this paper, we show that the performance of a learnt generative model is closely related to the model's ability to accurately represent the inferred latent data distribution, i.e. its topology and structural properties. We propose LaDDer to achieve accurate modelling of the latent data distribution in a variational autoencoder framework and to facilitate better representation learning. The central idea of LaDDer is a meta-embedding concept, which uses multiple VAE models to learn an embedding of the embeddings, forming a ladder of encodings. We use a non-parametric mixture as the hyper-prior for the innermost VAE and learn all the parameters in a unified variational framework. Through extensive experiments, we show that our LaDDer model accurately estimates complex latent distributions and improves representation quality.
lin-shuyu/aimsMiniProject1
lin-shuyu/AttSets
AttSets in TensorFlow (IJCV 2019)
lin-shuyu/botorch
Bayesian optimization in PyTorch
lin-shuyu/floorplan
This repository contains a collection of floor plans on which we will carry out experiments. Each floor plan includes its scale, its heading relative to North, and the WiFi router positions.
lin-shuyu/GitExampleXC9
lin-shuyu/GPR-for-field-map
First AIMS mini-project
lin-shuyu/Hello-world
Just another repository, for getting-started practice.
lin-shuyu/models
Models and examples built with TensorFlow
lin-shuyu/modified-wae
Added my WiSE-ALE encoder-decoder architecture to the WAE code.
lin-shuyu/wae
Wasserstein Auto-Encoders