Pinned Repositories
mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
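The Shazeer et al. paper this repo re-implements routes each input to only a few experts via a learned gate. A minimal sketch of that top-k gating idea (class and parameter names here are illustrative, not the repo's actual API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKGate(nn.Module):
    # Sketch of sparsely-gated routing (Shazeer et al., 2017): compute a
    # softmax over experts, but keep only the top-k experts per input so
    # the mixture weights are sparse.
    def __init__(self, d_model: int, n_experts: int, k: int = 2):
        super().__init__()
        self.w_gate = nn.Linear(d_model, n_experts, bias=False)
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.w_gate(x)                    # (batch, n_experts)
        topv, topi = logits.topk(self.k, dim=-1)   # keep k best experts
        masked = torch.full_like(logits, float("-inf"))
        masked.scatter_(-1, topi, topv)            # -inf elsewhere
        return F.softmax(masked, dim=-1)           # sparse mixture weights

gate = TopKGate(d_model=16, n_experts=8, k=2)
weights = gate(torch.randn(4, 16))
# Each row has exactly k nonzero weights that sum to 1.
```

The full paper adds noise to the gate logits and a load-balancing loss on top of this; the sketch shows only the sparsity mechanism.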
iMoLD
Official implementation for Learning Invariant Molecular Representation in Latent Discrete Space (NeurIPS 2023)
PH-Reg
Code for "Deep Regression Representation Learning with Topology" (ICML 2024)
2022_wechat_bigdata_rank19
oscar_caption
MMD-VAE
PyTorch implementation of the Maximum Mean Discrepancy Variational Autoencoder, a member of the InfoVAE family that maximizes mutual information between the isotropic Gaussian prior (the latent space) and the data distribution.
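At the core of the MMD-VAE is a maximum mean discrepancy term that pulls the encoded latents toward the isotropic Gaussian prior. A minimal sketch of that loss term with a Gaussian kernel (function names and the fixed bandwidth are assumptions for illustration, not this repo's implementation):

```python
import torch

def gaussian_kernel(a: torch.Tensor, b: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Pairwise RBF kernel k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 sigma^2)).
    d2 = torch.cdist(a, b).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

def mmd(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Biased estimate of squared MMD between samples x ~ p and y ~ q;
    # it is zero exactly when the two empirical kernel mean embeddings match.
    return (gaussian_kernel(x, x, sigma).mean()
            - 2 * gaussian_kernel(x, y, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean())

# In an MMD-VAE, x would be the encoder outputs z and y samples from N(0, I);
# this term replaces the KL divergence of a standard VAE.
z = torch.randn(64, 8)       # stand-in for encoded latents
prior = torch.randn(64, 8)   # samples from the isotropic Gaussian prior
loss = mmd(z, prior)
```

Practical implementations typically sum this kernel over several bandwidths rather than using a single fixed sigma.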
Domain-Adaptation-Regression
Code release for "Representation Subspace Distance for Domain Adaptation Regression" (ICML 2021)
MoleOOD
Official implementation for the paper "Learning Substructure Invariance for Out-of-Distribution Molecular Representations" (NeurIPS 2022).
Maximum-Mean-Discrepancy-Variational-Autoencoder
A PyTorch implementation of the MMD-VAE, an Information-Maximizing Variational Autoencoder (InfoVAE), based on the TensorFlow implementation published by the author of the original InfoVAE paper.
panmianzhi's Repositories