hc495
Machine / Deep Learning Theory. Ph.D. Student
Japan Advanced Institute of Science and Technology, Ishikawa, Japan
Pinned Repositories
adnmb-Ig7nipt-Hehugui-backup
Backup of the 阖壶鬼团 threads from A Island (adnmb); data sourced from ftoovvr.github.io, original author Ig7nipt
bert
TensorFlow code and pre-trained models for BERT
bertviz
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
BIT_885_Problems
Beijing Institute of Technology - 885 - Programming problems and solutions for the BIT "Software Engineering Fundamentals (885)" graduate entrance exam, in C
collaborative-attention
Code for Multi-Head Attention: Collaborate Instead of Concatenate
jekyll-theme-WuK
External module of my personal page
NoisyICL
Numerical_Analysis_Notes
Numerical analysis notes; feel free to use them >.<
PioLive2D
External module of my personal page, hosted on jsDelivr
Simple_Calculator
Compilation principle exercise: simple calculator
hc495's Repositories
hc495/BIT_885_Problems
Beijing Institute of Technology - 885 - Programming problems and solutions for the BIT "Software Engineering Fundamentals (885)" graduate entrance exam, in C
hc495/Numerical_Analysis_Notes
Numerical analysis notes; feel free to use them >.<
hc495/bert
TensorFlow code and pre-trained models for BERT
hc495/jekyll-theme-WuK
External module of my personal page
hc495/NoisyICL
hc495/PioLive2D
External module of my personal page, hosted on jsDelivr
hc495/Simple_Calculator
Compilation principle exercise: simple calculator
hc495/adnmb-Ig7nipt-Hehugui-backup
Backup of the 阖壶鬼团 threads from A Island (adnmb); data sourced from ftoovvr.github.io, original author Ig7nipt
hc495/bertviz
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
hc495/collaborative-attention
Code for Multi-Head Attention: Collaborate Instead of Concatenate
hc495/I491E_Project
hc495/Image_Caption
Just a homework assignment
hc495/Journal-Information
Information on computer science journals
hc495/keras-bert
Implementation of BERT that can load the official pre-trained models for feature extraction and prediction
hc495/learned_optimization
hc495/math-learning
Books for learning mathematics
hc495/MetaICL
An original implementation of "MetaICL: Learning to Learn In Context" by Sewon Min, Mike Lewis, Luke Zettlemoyer, and Hannaneh Hajishirzi
hc495/music-generation-with-DL
Resources on music generation with deep learning
hc495/revisit-bert-finetuning
For the code release of our arXiv paper "Revisiting Few-sample BERT Fine-tuning" (https://arxiv.org/abs/2006.05987).
hc495/StaICC
A standardized toolkit for classification tasks in in-context learning.
hc495/TextGuide
A text truncation method, useful, for instance, in long-text classification
hc495/transformers
🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
hc495/tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.