Pinned Repositories
Auto-6ML
Auto-6ML is a Jittor library that automates machine learning workflows.
awesome-AutoML
A curated list of AutoML papers/tutorials/slides etc.
CMW-Net
PyTorch implementation of TPAMI 2023: CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning
Meta-SPL
PyTorch implementation of Meta-SPL (self-paced learning).
meta-weight-net
NeurIPS'19: Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting (PyTorch implementation for noisy labels).
Meta-weight-net_class-imbalance
NeurIPS'19: Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting (PyTorch implementation for class imbalance).
MLR-SNet
This is an official PyTorch implementation of MLR-SNet: Transferable LR Schedules for Heterogeneous Tasks
Multitask-Learning
Multitask Learning Resources
Probabilistic-MW-Net
TNNLS 2021: A Probabilistic Formulation for Meta-Weight-Net (PyTorch implementation for noisy labels)
SLeM-Theory
Implementation of the meta-regularization proposed in the SLeM theory paper "Learning an Explicit Hyper-parameter Prediction Function Conditioned on Tasks".
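Several of the pinned repositories above (meta-weight-net, CMW-Net, Probabilistic-MW-Net) center on learning an explicit sample-weighting function. As a rough conceptual sketch only, not code taken from any of these repositories: a tiny MLP maps each sample's training loss to a weight in (0, 1); in the full methods, this network's parameters are then tuned by meta-learning against a small clean validation set. All names and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class WeightNet:
    """Illustrative Meta-Weight-Net-style mapping: per-sample loss -> weight in (0, 1).

    This is a hand-rolled sketch, not the repositories' actual architecture.
    """
    def __init__(self, hidden=100):
        # One hidden layer; parameters would normally be meta-learned.
        self.w1 = rng.normal(scale=0.1, size=(1, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(scale=0.1, size=(hidden, 1))
        self.b2 = np.zeros(1)

    def __call__(self, loss):
        # loss: (batch, 1) column of per-sample training losses
        h = np.maximum(loss @ self.w1 + self.b1, 0.0)  # ReLU hidden layer
        return sigmoid(h @ self.w2 + self.b2)          # weights in (0, 1)

wnet = WeightNet()
losses = rng.random((8, 1))   # stand-in per-sample training losses
weights = wnet(losses)        # (8, 1) per-sample weights in (0, 1)
```

In the actual methods, the weighted training loss `(weights * losses).mean()` updates the classifier, while the weighting network is updated through a bi-level (meta-gradient) step on held-out clean data.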
xjtushujun's Repositories
xjtushujun/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, online EWC, SI, LwF, DGR, DGR+distill, RtF, iCaRL).
xjtushujun/meta-transfer-learning-tensorflow
TensorFlow implementation for "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
xjtushujun/awesome-AutoML-and-Lightweight-Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
xjtushujun/Awesome-Learning-with-Label-Noise
A curated list of resources for Learning with Noisy Labels
xjtushujun/Awesome-Meta-Learning-1
A curated list of Meta Learning papers, code, books, blogs, videos, datasets and other resources.
xjtushujun/awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
xjtushujun/awesome-zero-shot-learning
A curated list of papers, code and resources pertaining to zero shot learning
xjtushujun/CloserLookFewShot
Source code for the ICLR'19 paper "A Closer Look at Few-shot Classification"
xjtushujun/cost-sensitive-learning
xjtushujun/coteaching_plus
ICML'19: How does Disagreement Help Generalization against Label Corruption?
xjtushujun/DANN
PyTorch implementation of Domain-Adversarial Training of Neural Networks
xjtushujun/disentangling-vae
Experiments for understanding disentanglement in VAE latent representations
xjtushujun/fast-autoaugment
Official Implementation of 'Fast AutoAugment' in PyTorch.
xjtushujun/Feature_Critic
Feature-Critic Networks for Heterogeneous Domain Generalisation
xjtushujun/few-shot
Repository for few-shot machine learning projects
xjtushujun/few-shot-ssl-public
Meta Learning for Semi-Supervised Few-Shot Classification
xjtushujun/FewShotWithoutForgetting
xjtushujun/google-research
Google AI Research
xjtushujun/HowToTrainYourMAMLPytorch
The original code for the paper "How to train your MAML", along with a replication of the original "Model-Agnostic Meta-Learning" (MAML) paper in PyTorch.
xjtushujun/importance-reweighting
Implementation of the importance-reweighting (IW) method from Liu T. and Tao D., "Classification with Noisy Labels by Importance Reweighting"
xjtushujun/iNaturalist-2019-Fine-grained-Classification-Competition
xjtushujun/iNaturalist_2019
This code built for the Competition of iNaturalist 2019 at FGVC6
xjtushujun/l4-optimizer
Code for paper "L4: Practical loss-based stepsize adaptation for deep learning"
xjtushujun/MAML-Pytorch
Elegant PyTorch implementation of the paper "Model-Agnostic Meta-Learning" (MAML)
xjtushujun/MER
Fork of the GEM project (https://github.com/facebookresearch/GradientEpisodicMemory) including Meta-Experience Replay (MER) methods from the ICLR 2019 paper (https://openreview.net/pdf?id=B1gTShAct7)
xjtushujun/MLNT
Meta-Learning based Noise-Tolerant Training
xjtushujun/Multi-tasking_Learning_With_Unreliable_Labels
Extending the NLNN algorithm proposed by Bekker & Goldberger to a multi-task learning setup to handle noisy labels. To augment low-resource data, artificial annotators are used; the goal is to generate clean labeled training data from their noisy annotations.
xjtushujun/NAO_pytorch
PyTorch implementation of Neural Architecture Optimization
xjtushujun/pytorch-lars
PyTorch implementation of LARS (Layer-wise Adaptive Rate Scaling)
xjtushujun/VITutorial
This repository stores slides for a tutorial on variational inference for NLP audiences.