THUNLP-MT
Machine Translation Group, Natural Language Processing Lab at Tsinghua University (THUNLP). Please refer to https://github.com/thunlp for more NLP resources.
Tsinghua University, Beijing, China
Pinned Repositories
Document-Transformer
Improving the Transformer translation model with document-level context
dyMEAN
This repo contains the code for our paper "End-to-End Full-Atom Antibody Design"
Mask-Align
Code for our paper "Mask-Align: Self-Supervised Neural Word Alignment" (ACL 2021)
MEAN
This repo contains the code for our paper "Conditional Antibody Design as 3D Equivariant Graph Translation".
MT-Reading-List
A machine translation reading list maintained by Tsinghua Natural Language Processing Group
PS-VAE
This repo contains the code for our paper "Molecule Generation by Principal Subgraph Mining and Assembling".
StableToolBench
A tool learning benchmark built on ToolBench that aims to balance stability and realism.
TG-Reading-List
A text generation reading list maintained by Tsinghua Natural Language Processing Group.
THUCC
An open-source classical Chinese information processing toolkit developed by Tsinghua Natural Language Processing Group
THUMT
An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
THUNLP-MT's Repositories
THUNLP-MT/MT-Reading-List
A machine translation reading list maintained by Tsinghua Natural Language Processing Group
THUNLP-MT/THUMT
An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
THUNLP-MT/StableToolBench
A tool learning benchmark built on ToolBench that aims to balance stability and realism.
THUNLP-MT/dyMEAN
This repo contains the code for our paper "End-to-End Full-Atom Antibody Design"
THUNLP-MT/MEAN
This repo contains the code for our paper "Conditional Antibody Design as 3D Equivariant Graph Translation".
THUNLP-MT/PS-VAE
This repo contains the code for our paper "Molecule Generation by Principal Subgraph Mining and Assembling".
THUNLP-MT/GET
This repo contains the code for our paper "Generalist Equivariant Transformer Towards 3D Molecular Interaction Learning" (ICML 2024).
THUNLP-MT/PepGLAD
Code for our paper "Full-Atom Peptide Design with Geometric Latent Diffusion" (NeurIPS 2024)
THUNLP-MT/Template-NMT
THUNLP-MT/SKR
Self-Knowledge Guided Retrieval Augmentation for Large Language Models (EMNLP 2023 Findings)
THUNLP-MT/StreamingBench
THUNLP-MT/PLM4MT
Code for our paper "MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators" (ACL 2022)
THUNLP-MT/ModelCompose
Official code for our paper "Model Composition for Multimodal Large Language Models"
THUNLP-MT/PGRA
Prompt-Guided Retrieval for Non-Knowledge-Intensive Tasks
THUNLP-MT/PromptGating4MCTG
This is the repo for our work “An Extensible Plug-and-Play Method for Multi-Aspect Controllable Text Generation” (ACL 2023).
THUNLP-MT/Brote
THUNLP-MT/CODIS
Repo for our paper "CODIS: Benchmarking Context-Dependent Visual Comprehension for Multimodal Large Language Models".
THUNLP-MT/FIIG
Filling the Image Information Gap for VQA: Prompting Large Language Models to Proactively Ask Questions (EMNLP 2023 Findings)
THUNLP-MT/ktnmt
THUNLP-MT/TRAN
This is the repo for our work “Failures Pave the Way: Enhancing Large Language Models through Tuning-free Rule Accumulation” (EMNLP 2023).
THUNLP-MT/DBKD-PLM
Codebase for our ACL 2023 long paper "Bridging the Gap between Decision and Logits in Decision-based Knowledge Distillation for Pre-trained Language Models".
THUNLP-MT/symbol2language
Speak It Out: Solving Symbol-Related Problems with Symbol-to-Language Conversion for Language Models
THUNLP-MT/DEEM
THUNLP-MT/MetaRanking
Official code repo for our work "Meta Ranking: Less Capable Language Models are Capable for Single Response Judgement".
THUNLP-MT/RiC
THUNLP-MT/ROGO
This repo contains the code for our work “Restricted Orthogonal Gradient Projection for Continual Learning”.
THUNLP-MT/ActiView
THUNLP-MT/BTP
THUNLP-MT/CKD
Continual Knowledge Distillation for Neural Machine Translation
THUNLP-MT/PANDA