Calculate the Softmax layer of Attention
You only need numpy >= 1.18.
For example,

```python
import numpy as np
from functions import normal_softmax, lsh_softmax

# Random input scores
R = np.random.randn(100, 10000)

normal_sm = normal_softmax(R)  # standard softmax
lsh_sm = lsh_softmax(R)        # LSH-based softmax
```
Note: for better visibility, the diagonal components are set to 0.
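The `functions` module itself is not shown here. As a point of reference, `normal_softmax` presumably computes a standard, numerically stable softmax along the last axis, roughly as in the sketch below (the function name `normal_softmax_sketch` is a hypothetical stand-in); the bucketing used by `lsh_softmax` is specific to this repository.

```python
import numpy as np

def normal_softmax_sketch(x: np.ndarray, axis: int = -1) -> np.ndarray:
    """Numerically stable softmax along the given axis (hypothetical stand-in for normal_softmax)."""
    shifted = x - np.max(x, axis=axis, keepdims=True)  # subtract the row max for stability
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)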
The execution times are plotted for a range of sequence lengths.
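A minimal timing sketch, assuming the same `functions` module and using `time.perf_counter`; the sequence lengths below are illustrative, not necessarily the ones used for the plot.

```python
import time
import numpy as np
from functions import normal_softmax, lsh_softmax

for seq_len in (1000, 2000, 5000, 10000):  # illustrative lengths only
    R = np.random.randn(100, seq_len)
    for name, fn in (("normal_softmax", normal_softmax), ("lsh_softmax", lsh_softmax)):
        start = time.perf_counter()
        fn(R)
        elapsed = time.perf_counter() - start
        print(f"seq_len={seq_len:>6} {name:>15}: {elapsed:.4f} s")
```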