Pinned Repositories
mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
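The sparsely-gated layer from the Shazeer et al. paper routes each input to only the top-k experts: keep the k largest gating logits, softmax over just those, and zero out the rest. A minimal NumPy sketch of that routing step (a hypothetical helper for illustration, not this repository's API):

```python
import numpy as np

def top_k_gating(logits, k=2):
    """Keep the top-k expert logits per row, softmax over them,
    and assign zero weight to all other experts (hypothetical helper)."""
    topk = np.argsort(logits, axis=-1)[..., -k:]   # indices of the k largest logits
    mask = np.zeros_like(logits, dtype=bool)
    np.put_along_axis(mask, topk, True, axis=-1)
    masked = np.where(mask, logits, -np.inf)       # suppress non-selected experts
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)   # sparse gate weights, rows sum to 1

gates = top_k_gating(np.array([[1.0, 3.0, 0.5, 2.0]]), k=2)
# exactly two experts per row receive nonzero weight
```

Because only k experts receive nonzero weight, only those experts run in the forward pass, which is what makes the layer's capacity scale cheaply with the number of experts.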
iMoLD
Official implementation for Learning Invariant Molecular Representation in Latent Discrete Space (NeurIPS 2023)
2022_wechat_bigdata_rank19
IMG
A repository for storing images.
oscar_caption
panmianzhi.github.io
This is my first repository
MMD-VAE
PyTorch implementation of the Maximum Mean Discrepancy Variational Autoencoder (MMD-VAE), a member of the InfoVAE family that maximizes the mutual information between the isotropic Gaussian prior (the latent space) and the data distribution.
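The MMD-VAE replaces the usual KL term with the Maximum Mean Discrepancy, a kernel-based distance between the aggregate posterior and the prior. A minimal sketch of the squared-MMD statistic with a Gaussian (RBF) kernel, assuming plain NumPy arrays rather than this repository's code:

```python
import numpy as np

def rbf_mmd2(x, y, sigma=1.0):
    """Biased estimate of squared MMD between sample sets x and y
    under a Gaussian kernel (illustrative sketch, not the repo's code)."""
    def kernel(a, b):
        # pairwise squared distances -> RBF kernel matrix
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

rng = np.random.default_rng(0)
same = rbf_mmd2(rng.normal(size=(100, 2)), rng.normal(size=(100, 2)))
shifted = rbf_mmd2(rng.normal(size=(100, 2)), rng.normal(3.0, 1.0, size=(100, 2)))
# MMD^2 is near zero when the two samples come from the same
# distribution, and grows as the distributions move apart
```

In training, x would be latent codes sampled from the encoder and y samples from the isotropic Gaussian prior, so minimizing this term pulls the aggregate posterior toward the prior without collapsing per-sample information.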
MoleOOD
Official implementation for the paper "Learning Substructure Invariance for Out-of-Distribution Molecular Representations" (NeurIPS 2022).
Maximum-Mean-Discrepancy-Variational-Autoencoder
A PyTorch implementation of the MMD-VAE, an Information-Maximizing Variational Autoencoder (InfoVAE), based on the TensorFlow implementation published by the author of the original InfoVAE paper.