Gosicfly's Stars
ShannonAI/glyce
Code for NeurIPS 2019 - Glyce: Glyph-vectors for Chinese Character Representations
tengkz/tensorflow_notes
Notes from reading the TensorFlow source code
textflint/textflint
Unified Multilingual Robustness Evaluation Toolkit for Natural Language Processing
coetaur0/ESIM
Implementation of the ESIM model for natural language inference with PyTorch
vnpy/vnpy
An open-source Python framework for building quantitative trading platforms
wangshub/RL-Stock
📈 How to trade stocks automatically with deep reinforcement learning
coder2gwy/coder2gwy
The internet industry's first civil-service exam guide for programmers, jointly written by three former big-tech programmers who have already entered the public sector.
thu-coai/NLG_book
An introduction to the book 《现代自然语言生成》 (Modern Natural Language Generation)
thunlp/ERNIE
Source code and dataset for ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities"
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme. Now also handles automatic summarization, text classification, sentiment analysis, NER, POS tagging, and other tasks; supports T5 models and GPT-2 for text continuation.
datawhalechina/key-book
Proofs, worked examples, supplementary concepts, and reference walkthroughs for the book 《机器学习理论导引》 (Introduction to the Theory of Machine Learning, the "treasure chest book").
yuanxiaosc/BERT-for-Sequence-Labeling-and-Text-Classification
Template code for using BERT for sequence labeling and text classification, making it easy to apply BERT to more tasks. Currently includes CoNLL-2003 named entity recognition, Snips slot filling, and intent prediction.
smoothnlp/SmoothNLP
An NLP toolset with a focus on explainable inference
bytedance/effective_transformer
Running BERT without Padding
Yaozeng/TinyBERT
brightmart/albert_zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
huiyadanli/RevokeMsgPatcher
:trollface: A hex editor for WeChat/QQ/TIM - an anti-recall patch for WeChat/QQ/TIM on PC ("I've already seen it; recalling the message won't help")
zhihu/cuBERT
Fast implementation of BERT inference, directly on NVIDIA GPUs (CUDA, cuBLAS) and Intel MKL
goto456/stopwords
Common Chinese stopword lists (the HIT stopword list, Baidu stopword list, etc.)
danan0755/Bert_Classifier
BERT text classification and NER; ALBERT, keras_bert, bert4keras, Kashgari, FastBERT; model deployment with Flask + uWSGI + Keras; temporal entity recognition, TF-IDF keyword extraction, TF-IDF text similarity, user sentiment analysis
googlehosts/hosts
Mirrors: https://scaffrey.coding.net/p/hosts/git / https://git.qvq.network/googlehosts/hosts
LeeSureman/Flat-Lattice-Transformer
code for ACL 2020 paper: FLAT: Chinese NER Using Flat-Lattice Transformer
425776024/nlpcda
One-click Chinese data augmentation package; NLP data augmentation, BERT data augmentation, EDA: pip install nlpcda
BrikerMan/Kashgari
Kashgari is a production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification, with Word2Vec, BERT, and GPT-2 language embeddings.
BrikerMan/Kashgari-doc-zh
Chinese documentation for the Kashgari framework
luopeixiang/named_entity_recognition
Chinese named entity recognition (with concrete implementations of several models: HMM, CRF, BiLSTM, BiLSTM+CRF)
JayYip/m3tl
BERT for Multitask Learning
kpe/bert-for-tf2
A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
NVIDIA/DeepLearningExamples
State-of-the-art deep learning scripts organized by model, easy to train and deploy, with reproducible accuracy and performance on enterprise-grade infrastructure.