senkey705's Stars
5KG-UCAS/CosmeticsKG
A knowledge graph project about cosmetics, currently containing 3,000 nodes and 15,000 edges across two categories (lipstick and perfume). Supports knowledge-graph search, visualization, and a question-answering system.
PaddlePaddle/ERNIE
Official implementations of various ERNIE-family pre-training models, covering language understanding & generation, multimodal understanding & generation, and beyond.
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme. It also supports automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, as well as T5 models and GPT-2 for article continuation.
renmada/t5-pegasus-pytorch
lucidrains/reformer-pytorch
Reformer, the efficient Transformer, in PyTorch
wjunneng/2020-AI-Financial-User-Review-Categories
2020 AI Yanxishe (AI研习社) competition: financial user review classification
lonePatient/awesome-pretrained-chinese-nlp-models
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
JunnYu/RoFormer_pytorch
PyTorch implementation of RoFormer V1 & V2
bojone/bert4keras
A Keras implementation of Transformers for humans
mali19064/LSTM-CRF-pytorch-faster
A parallelized LSTM-CRF implementation that is more than 1000x faster than the slower version in the official PyTorch tutorial (https://pytorch.org/tutorials/beginner/nlp/advanced_tutorial.html), from which it was adapted.
xy2333/O2O
Tianchi competition: O2O coupon usage prediction (rank: top 1%, AUC: 0.7948; first place: 0.8116)
DesertsX/yulequan-relations-graph
Celebrity relationship knowledge graph. Demo URL:
ymcui/Chinese-BERT-wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
circlePi/Bert_Chinese_Ner_pytorch
Chinese named entity recognition (NER) based on BERT and CRF
ProHiryu/bert-chinese-ner
Chinese NER using the pre-trained language model BERT
Arsey/keras-transfer-learning-for-oxford102
Keras pretrained models (VGG16, InceptionV3, ResNet50, ResNet152) + transfer learning for predicting classes in the Oxford 102 flower dataset
bharathgs/Awesome-pytorch-list
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
codertimo/BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
google-research/bert
TensorFlow code and pre-trained models for BERT