Pinned Repositories
AiLearning
AiLearning: Machine Learning (ML), Deep Learning (DL), and Natural Language Processing (NLP)
attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
awesome-algorithm
LeetCode, HackerRank, 剑指offer, and classic algorithm implementations
bert
TensorFlow code and pre-trained models for BERT
BERT-keras
Keras implementation of BERT (Bidirectional Encoder Representations from Transformers)
BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
bert_language_understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding
bi-att-flow
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical model that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
bitcoin-bubble-index
A visualization and analysis tool for the Bitcoin price bubble, covering basic price information, 60-day cumulative gains, a hot-keywords index, and a bubble index.
cw2vec
Implementation of the cw2vec model
ShuGao0810's Repositories
ShuGao0810/cw2vec
Implementation of the cw2vec model
ShuGao0810/AiLearning
AiLearning: Machine Learning (ML), Deep Learning (DL), and Natural Language Processing (NLP)
ShuGao0810/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
ShuGao0810/awesome-algorithm
LeetCode, HackerRank, 剑指offer, and classic algorithm implementations
ShuGao0810/bert
TensorFlow code and pre-trained models for BERT
ShuGao0810/BERT-keras
Keras implementation of BERT (Bidirectional Encoder Representations from Transformers)
ShuGao0810/BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
ShuGao0810/bert_language_understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding
ShuGao0810/bi-att-flow
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical model that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
ShuGao0810/bitcoin-bubble-index
A visualization and analysis tool for the Bitcoin price bubble, covering basic price information, 60-day cumulative gains, a hot-keywords index, and a bubble index.
ShuGao0810/cocoNLP
A Chinese information extraction tool.
ShuGao0810/CS224n-2019-solutions
Complete solutions for Stanford CS224n, Winter 2019
ShuGao0810/EventTriplesExtraction
An experimental, demo-level tool for text information extraction (event-triple extraction) based on dependency parsing and semantic role labeling; the extracted triples can support text-understanding applications such as document event chains and topic graphs.
ShuGao0810/finetune-transformer-lm
Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
ShuGao0810/language
Shared repository for open-sourced projects from the Google AI Language team.
ShuGao0810/nlp_base
Basic models for natural language processing
ShuGao0810/nlp_overview
Overview of Modern Deep Learning Techniques Applied to Natural Language Processing
ShuGao0810/nndl-codes
Sample code for NNDL
ShuGao0810/nndl.github.io
Neural Networks and Deep Learning (《神经网络与深度学习》)
ShuGao0810/paip-lisp
Lisp code for the textbook "Paradigms of Artificial Intelligence Programming"
ShuGao0810/pytorch-openai-transformer-lm
A PyTorch implementation of OpenAI's finetuned transformer language model with a script to import the weights pre-trained by OpenAI
ShuGao0810/tensor2tensor
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
ShuGao0810/transformer
A TensorFlow implementation of the Transformer from "Attention Is All You Need"
ShuGao0810/USTC-Course
❤️ Course resources of the University of Science and Technology of China (USTC)
ShuGao0810/word2vec
Python interface to Google word2vec
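A word2vec interface like the one above typically exposes trained vectors for similarity queries. As a minimal sketch of the underlying operation (using a hypothetical toy vocabulary, not the package's real API), nearest-neighbor lookup reduces to cosine similarity over the embedding table:

```python
import math

# Toy 3-dimensional embeddings standing in for vectors a trained
# word2vec model would provide (hypothetical values, for illustration only).
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.75, 0.70, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def most_similar(word, topn=2):
    """Rank all other vocabulary words by cosine similarity to `word`."""
    query = embeddings[word]
    scored = [(w, cosine(query, v)) for w, v in embeddings.items() if w != word]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:topn]

print(most_similar("king"))  # "queen" ranks above "apple"
```

Real word2vec libraries do the same computation with matrix operations over the full vocabulary, which is why a normalized embedding matrix makes nearest-neighbor queries a single matrix-vector product.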