Pinned Repositories
BERT_FP
Fine-grained Post-training for Improving Retrieval-based Dialogue Systems - NAACL 2021
flax
Flax is a neural network library for JAX that is designed for flexibility.
IncrementalFSTC
Incremental Few-shot Text Classification with Multi-round New Classes: Formulation, Dataset and System. NAACL 2021. https://arxiv.org/abs/2104.11882
jaxformer
Minimal library to train LLMs on TPU in JAX with pjit().
Multi-Grained-NER
Multi-Grained Named Entity Recognition (ACL 2019)
ParlAI
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
RL4LMs
A modular RL library to fine-tune language models to human preferences
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
ZeroShotCapsule
Code for Paper "Zero-shot User Intent Detection via Capsule Neural Networks".
llm-foundry
LLM training code for Databricks foundation models