mbiesialska's Stars
marionbartl/gender-bias-BERT
This repository holds the code for my master's thesis entitled "The Association of Gender Bias with BERT - Measuring, Mitigating and Cross-lingual Portability"
illidanlab/personaGPT
Implementation of PersonaGPT Dialog Model
adbar/trafilatura
Python & Command-line tool to gather text and metadata on the Web: Crawling, scraping, extraction, output as CSV, JSON, HTML, MD, TXT, XML
pliang279/sent_debias
[ACL 2020] Towards Debiasing Sentence Representations
dragen1860/MAML-Pytorch
Elegant PyTorch implementation of the paper "Model-Agnostic Meta-Learning" (MAML)
shaoxia57/Bias_in_Gendered_Languages
Repository for the EMNLP 2019 paper on gender bias in gendered languages.
microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
PaddlePaddle/ERNIE
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
VainF/Awesome-Contrastive-Learning
Awesome Contrastive Learning for CV & NLP
labmlai/annotated_deep_learning_paper_implementations
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
yuanli2333/Teacher-free-Knowledge-Distillation
Knowledge distillation: CVPR 2020 oral paper "Revisiting Knowledge Distillation via Label Smoothing Regularization"
AmanDaVinci/lifelong-learning-limitations
Limitations of existing lifelong learning frameworks
AmanDaVinci/lifelong-learning-baselines
Baselines for lifelong learning
AmanDaVinci/lifelong-learning
Lifelong Learning for Language Models
lebrice/Sequoia
The Research Tree - A playground for research at the intersection of Continual, Reinforcement, and Self-Supervised Learning.
LeslieOverfitting/selective_distillation
chrhenning/hypercl
Continual Learning with Hypernetworks. A continual learning approach that learns a dedicated, fine-tuned set of parameters for every task without increasing the number of trainable weights, and is robust against catastrophic forgetting.
caoy1996/CL-NMT
ServiceNow/osaka
Codebase for "Online Fast Adaptation and Knowledge Accumulation: a New Approach to Continual Learning". This is a ServiceNow Research project that was started at Element AI.
tristandeleu/pytorch-maml
An Implementation of Model-Agnostic Meta-Learning in PyTorch with Torchmeta
tristandeleu/pytorch-meta
A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
dbaranchuk/memory-efficient-maml
Memory efficient MAML using gradient checkpointing
cbfinn/maml
Code for "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks"
chaoyanghe/Awesome-Federated-Learning
FedML - The Research and Production Integrated Federated Learning Library: https://fedml.ai
llmt-wmt/allies_llmt_beat
Repository for the ALLIES lifelong learning machine translation baseline system
mattriemer/MER
Fork of the GEM project (https://github.com/facebookresearch/GradientEpisodicMemory) including Meta-Experience Replay (MER) methods from the ICLR 2019 paper (https://openreview.net/pdf?id=B1gTShAct7)
SALT-NLP/IDBR
Code for the paper "Continual Learning for Text Classification with Information Disentanglement Based Regularization"
ej0cl6/deep-active-learning
Deep Active Learning
JordanAsh/badge
An implementation of the BADGE batch active learning algorithm.
forest-snow/alps
Code accompanying EMNLP 2020 paper "Cold-start Active Learning through Self-supervised Language Modeling".