Pinned Repositories
1171000108
bert
TensorFlow code and pre-trained models for BERT
BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
ConMask
The ConMask model described in the paper "Open-World Knowledge Graph Completion".
Deep-Learning-Interview-Book
A deep learning interview handbook (covering mathematics, machine learning, deep learning, computer vision, natural language processing, SLAM, and other areas)
hit-1160300811
Open Source Software
hit-1171000108
keras-contrib
Keras community contributions
leetcode-master
《代码随想录》("Code Thoughts") LeetCode study guide: a recommended order for 200 classic problems, over 600,000 words of detailed illustrated explanations, video walkthroughs of difficult points, 50+ mind maps, and solutions in C++, Java, Python, Go, JavaScript, and other languages. No more confusion when learning algorithms!
named_entity_recognition
Chinese named entity recognition (with concrete implementations of several models: HMM, CRF, BiLSTM, and BiLSTM+CRF)
HIT-SCIR-xuanxuan's Repositories
HIT-SCIR-xuanxuan/1171000108
HIT-SCIR-xuanxuan/bert
TensorFlow code and pre-trained models for BERT
HIT-SCIR-xuanxuan/BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
HIT-SCIR-xuanxuan/ConMask
The ConMask model described in the paper "Open-World Knowledge Graph Completion".
HIT-SCIR-xuanxuan/Deep-Learning-Interview-Book
A deep learning interview handbook (covering mathematics, machine learning, deep learning, computer vision, natural language processing, SLAM, and other areas)
HIT-SCIR-xuanxuan/hit-1160300811
Open Source Software
HIT-SCIR-xuanxuan/hit-1171000108
HIT-SCIR-xuanxuan/keras-contrib
Keras community contributions
HIT-SCIR-xuanxuan/leetcode-master
《代码随想录》("Code Thoughts") LeetCode study guide: a recommended order for 200 classic problems, over 600,000 words of detailed illustrated explanations, video walkthroughs of difficult points, 50+ mind maps, and solutions in C++, Java, Python, Go, JavaScript, and other languages. No more confusion when learning algorithms!
HIT-SCIR-xuanxuan/named_entity_recognition
Chinese named entity recognition (with concrete implementations of several models: HMM, CRF, BiLSTM, and BiLSTM+CRF)
HIT-SCIR-xuanxuan/NLP-Interview-Notes
Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from their own interviews and experience; the collection currently covers accumulated interview questions from across the NLP subfields.
HIT-SCIR-xuanxuan/OpenKS
OpenKS - a domain-generalizable knowledge learning and computation engine
HIT-SCIR-xuanxuan/PaddleNLP
An NLP library with awesome pre-trained Transformer models and an easy-to-use interface, supporting a wide range of NLP tasks from research to industrial applications.
HIT-SCIR-xuanxuan/spring-framework
Spring Framework
HIT-SCIR-xuanxuan/testing-library-docs
Docs site for *-testing-library
HIT-SCIR-xuanxuan/transformers
🤗 Transformers: state-of-the-art natural language processing for PyTorch and TensorFlow 2.0.
HIT-SCIR-xuanxuan/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities