Pinned Repositories
AAAI18-code
Code for the AAAI 2018 paper "Learning Structured Representation for Text Classification via Reinforcement Learning".
adversarial-multi-criteria-learning-for-CWS
Implementation of the ACL 2017 paper https://arxiv.org/abs/1704.07556
AspectSentimentSemEval
EventExtraction
EventExtraction_system
lstm-crf-ner
machine_learning_python
Self-implemented versions of common machine learning algorithms, adapted from code and materials found online. Implemented algorithms include KNN, K-means, EM, Perceptron, decision trees, logistic regression, SVM, AdaBoost, and naive Bayes.
pycorrector
pycorrector is a toolkit for Chinese text error correction, with implementations of Kenlm, Seq2Seq_Attention, BERT, MacBERT, ELECTRA, ERNIE, Transformer, and other models; ready to use out of the box.
pytorch-pretrained-BERT
📖The Big-&-Extending-Repository-of-Transformers: Pretrained PyTorch models for Google's BERT, OpenAI GPT & GPT-2, Google/CMU Transformer-XL.
qfzxhy's Repositories
qfzxhy/machine_learning_python
Self-implemented versions of common machine learning algorithms, adapted from code and materials found online. Implemented algorithms include KNN, K-means, EM, Perceptron, decision trees, logistic regression, SVM, AdaBoost, and naive Bayes.
qfzxhy/pycorrector
pycorrector is a toolkit for Chinese text error correction, with implementations of Kenlm, Seq2Seq_Attention, BERT, MacBERT, ELECTRA, ERNIE, Transformer, and other models; ready to use out of the box.
qfzxhy/pytorch-pretrained-BERT
📖The Big-&-Extending-Repository-of-Transformers: Pretrained PyTorch models for Google's BERT, OpenAI GPT & GPT-2, Google/CMU Transformer-XL.
qfzxhy/Automatic-Corpus-Generation
This repository is for the paper "A Hybrid Approach to Automatic Corpus Generation for Chinese Spelling Check"
qfzxhy/AutoPhraseX
Automated Phrase Mining from Massive Text Corpora in Python.
qfzxhy/bert
TensorFlow code and pre-trained models for BERT
qfzxhy/bert_attn_viz
Visualize BERT's self-attention layers on text classification tasks
qfzxhy/bi-att-flow
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
qfzxhy/chaizi
A Chinese character decomposition (chaizi) dictionary.
qfzxhy/ChineseNER
Notes and practice on Chinese named entity recognition (NER).
qfzxhy/ChineseNLPCorpus
Chinese natural language processing datasets, collected as material for everyday experiments. Additions and pull requests are welcome.
qfzxhy/DeepKE
An Open Toolkit for Knowledge Graph Extraction and Construction published at EMNLP2022 System Demonstrations.
qfzxhy/electra
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
qfzxhy/LEBERT
Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"
qfzxhy/LLM-Finetuning
LLM fine-tuning with PEFT.
qfzxhy/MacBERT
Revisiting Pre-trained Models for Chinese Natural Language Processing (Findings of EMNLP)
qfzxhy/multihead_joint_entity_relation_extraction
qfzxhy/NJUParser-pytorch
A collection of re-implementations of recent state-of-the-art constituency parsing methods in PyTorch.
qfzxhy/OpenNRE
Neural Relation Extraction implemented in TensorFlow
qfzxhy/PLOME
Source code for the paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" in ACL2021
qfzxhy/python-tutorial
A practical Python tutorial covering Python basics, advanced features, object-oriented programming, multithreading, databases, data science, Flask, and web crawler development.
qfzxhy/sidentify-bert-based-word-level
The first step of an information extraction pipeline: identifying which relations are present, framed overall as a multi-class classification task.
qfzxhy/simpletransformers
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
qfzxhy/Soft-Masked-BERT
qfzxhy/test2
qfzxhy/text_gcn
Graph Convolutional Networks for Text Classification. AAAI 2019
qfzxhy/toutiao-multilevel-text-classfication-dataset
A Chinese news text (multi-level) classification dataset from Toutiao (今日头条).
qfzxhy/TRAN
qfzxhy/transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
qfzxhy/transformers-ner
PyTorch named entity recognition with transformers.