Pinned Repositories
Category-theory-materials
Very fresh, very delicious
group-project
nano-BERT-mix
Apply NSP and MLM to pretrain in a very ugly way
short-text-classification
This repository contains code to reproduce the results in our paper "Transformers are Short Text Classifiers: A Study of Inductive Short Text Classifiers on Benchmarks and Real-world Datasets".