Pinned Repositories
3d-shapes
This repository contains the 3D Shapes dataset, used in Kim, Hyunjik and Mnih, Andriy, "Disentangling by Factorising," Proceedings of the 35th International Conference on Machine Learning (ICML), 2018, to assess the disentanglement properties of unsupervised learning methods.
benchmark_VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Beta-VAE
PyTorch implementation of β-VAE
coil_gecco22
Code for the paper: Peter J. Bentley, Soo Ling Lim, Adam Gaier and Linh Tran. 2022. COIL: Constrained Optimization in Learned Latent Space: Learning Representations for Valid Solutions. In Genetic and Evolutionary Computation Conference Companion (GECCO '22 Companion). ACM, Boston, USA.
DALLE-pytorch
Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
dsprites-dataset
Dataset to assess the disentanglement properties of unsupervised learning methods
g-means
mae
PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377
Purdue-ECE-562-Database-Management-Fall-2017
Taught by Arif Ghafoor
Purdue-ECE-661-Computer-Vision-Fall-2018
Taught by Avinash Kak
supershiye's Repositories
supershiye/Purdue-ECE-661-Computer-Vision-Fall-2018
Taught by Avinash Kak
supershiye/Purdue-ECE-562-Database-Management-Fall-2017
Taught by Arif Ghafoor
supershiye/Beta-VAE
PyTorch implementation of β-VAE
supershiye/3d-shapes
This repository contains the 3D Shapes dataset, used in Kim, Hyunjik and Mnih, Andriy, "Disentangling by Factorising," Proceedings of the 35th International Conference on Machine Learning (ICML), 2018, to assess the disentanglement properties of unsupervised learning methods.
supershiye/benchmark_VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
supershiye/coil_gecco22
Code for the paper: Peter J. Bentley, Soo Ling Lim, Adam Gaier and Linh Tran. 2022. COIL: Constrained Optimization in Learned Latent Space: Learning Representations for Valid Solutions. In Genetic and Evolutionary Computation Conference Companion (GECCO '22 Companion). ACM, Boston, USA.
supershiye/DALLE-pytorch
Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
supershiye/dsprites-dataset
Dataset to assess the disentanglement properties of unsupervised learning methods
supershiye/g-means
supershiye/mae
PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377
supershiye/mae_st
Official Open Source code for "Masked Autoencoders As Spatiotemporal Learners"
supershiye/Purdue-MA-598-Neural-Network
supershiye/pytorch-vq-vae
PyTorch implementation of VQ-VAE by Aäron van den Oord et al.
supershiye/pytorch-vqvae
Vector Quantized VAEs - PyTorch Implementation
supershiye/stablediffusion
High-Resolution Image Synthesis with Latent Diffusion Models
supershiye/torch-fidelity
High-fidelity performance metrics for generative models in PyTorch
supershiye/vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
supershiye/VQ-Diffusion
Official implementation of VQ-Diffusion
supershiye/vq-vae-2-pytorch
Implementation of Generating Diverse High-Fidelity Images with VQ-VAE-2 in PyTorch
supershiye/wasserstein-auto-encoder
A brief tutorial on the Wasserstein auto-encoder