Issues
Hi, is there an example of using this attention? How do I call the attention layer?
#21 opened by Lerry123 - 2
a = K.permute_dimensions(a, (0, 3, 2, 1))
#19 opened by monkeyshichi - 2
How can Keras's batch_dot() be rewritten in PyTorch?
#20 opened by linjingxu - 3
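For the batch_dot question above, a minimal sketch of the usual translation (assuming the common attention case `axes=[2, 2]`, i.e. a dot product over the last axis of each tensor; shapes below are illustrative):

```python
import torch

# In Keras, K.batch_dot(x, y, axes=[2, 2]) with x of shape (b, m, d) and
# y of shape (b, n, d) contracts the last axis of both, giving (b, m, n).
# The PyTorch equivalent is a batched matmul against y with its last two
# dimensions swapped.
x = torch.randn(2, 3, 4)
y = torch.randn(2, 5, 4)

out = torch.matmul(x, y.transpose(1, 2))  # shape (2, 3, 5)
print(out.shape)  # torch.Size([2, 3, 5])
```

Other `axes` settings map to different `transpose`/`matmul` arrangements, so check the axes used at each call site before porting.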
Where are Q, K, and V each input from?
#14 opened by Ironeie - 13
K.permute_dimensions(A, (0,3,2,1)) raises an error at runtime
#3 opened by njuccq - 1
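A likely cause of the runtime error above: the pattern `(0, 3, 2, 1)` requires a 4-D tensor, so it fails if `A` has fewer dimensions. A minimal sketch of the intended reordering, shown with NumPy's equivalent `np.transpose` (the shapes are illustrative):

```python
import numpy as np

# (0, 3, 2, 1) keeps the batch axis and swaps axes 1 and 3; a 4-D input
# is required. With a 3-D tensor the same pattern raises a dimension error.
A = np.zeros((2, 3, 4, 5))
print(np.transpose(A, (0, 3, 2, 1)).shape)  # (2, 5, 4, 3)
```

If the error mentions a rank or dimension mismatch, print `K.int_shape(A)` first and confirm the tensor really has four axes.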
I don't understand what the "to_mask" method is for.
#17 opened by fatLime - 0
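Regarding the to_mask question above: such a method typically zeroes out (or heavily penalizes) padded positions so attention ignores them. A hedged sketch of the general idea; the function name `to_mask_sketch` and its exact signature are hypothetical and may differ from the repo's implementation:

```python
import numpy as np

def to_mask_sketch(x, seq_len, mode='mul'):
    """Mask padded steps of x (batch, steps, dim) given true lengths seq_len (batch,)."""
    steps = x.shape[1]
    # mask[i, t] = 1 while t is within sequence i's true length, else 0
    mask = (np.arange(steps)[None, :] < seq_len[:, None]).astype(x.dtype)
    mask = mask[:, :, None]  # broadcast over the feature dimension
    if mode == 'mul':
        return x * mask                      # zero out padding (for values)
    return x - (1 - mask) * 1e10             # large negative (before softmax)

x = np.ones((2, 4, 3))
out = to_mask_sketch(x, np.array([2, 4]))
print(out[0, :, 0])  # [1. 1. 0. 0.]
```

The multiplicative mode suits value tensors; the additive mode suits attention logits, where a large negative number makes softmax assign the padded positions near-zero weight.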
attention_keras.py
#13 opened by myndtt - 2
Position (embedding) in TF
#11 opened by lrx1213 - 1
It seems the parameter Wo is missing
#8 opened by duanyu - 1
Using attention for action recognition
#9 opened by guofuzheng - 1
How should the decoder input (target) be understood?
#6 opened by zoe218 - 1
Position_Embedding may have some problems
#2 opened by liangoy - 3
Can the Position Embedding be trained?
#1 opened by ZJUguquan