brightmart/bert_language_understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Python
Issues
In your TextCNN experiment, increasing the learning rate by an order of magnitude gives the same results even without pre-training
#18 opened by wykdg - 1
Error when running run_classifier_predict_online.py
#25 opened by Vincent131499 - 0
using bert with unused/discarded data
#24 opened by Priyansh2 - 4
run_classifier_predict_online.py
#16 opened by bihui9968 - 1
error in run_classifier_predict_online.py
#22 opened by xikunlun001 - 0
about pre_train
#21 opened by charlesfufu - 0
Question about running the pre_train command
#17 opened by c0derm4n - 0
What is the accuracy of masked word prediction?
#14 opened by guotong1988 - 4
Different result when setting lr to 0.001
#13 opened by liu-nlper - 1
About "scaled_dot_product_attention_batch"
#9 opened by yuanxiaosc - 1
Issue with tokenize_style=char
#10 opened by godfatherzzx - 1
The reason why pre-training works
#12 opened by guotong1988 - 2
question about learning rate
#11 opened by liu-nlper - 2
Bert_model doesn't work
#7 opened by geogreff - 2
The pre-trained MLM performance
#6 opened by yyht - 5
Question on the implementation of bert_cnn_model
#4 opened by zhhongzhi - 1
Question about pre-trained word embeddings
#5 opened by ethereal666 - 2
What is the final F1 performance?
#3 opened by guotong1988 - 3
Three undefined names
#2 opened by cclauss