Basic-Char-RNN
ACT-Char-RNN: An implementation of the paper Adaptive Computation Time for Recurrent Neural Networks in TensorFlow. However, I observe a phenomenon opposite to the one reported in the ACT paper: the network ponders less on the space character than on other characters. For example:
char | remaining probability | iterations |
---|---|---|
t | 0.144139 | 4 |
o | 0.069154 | 7 |
(space) | 0.245744 | 2 |
m | 0.101206 | 4 |
o | 0.149854 | 3 |
v | 0.172533 | 2 |
e | 0.189886 | 3 |
(space) | 0.309559 | 2 |
t | 0.016886 | 8 |
h | 0.063482 | 7 |
e | 0.145703 | 5 |
i | 0.445629 | 2 |
r | 0.194783 | 3 |
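The two statistics in the table correspond to the remainder R(t) and the number of ponder steps N(t) from the ACT paper: the unit keeps iterating until the accumulated halting probability reaches 1 − ε, and the remainder is whatever probability mass was left before the final iteration. A minimal sketch of that computation (the function name and the ε value are illustrative, not from this repo):

```python
def act_halting(halting_probs, eps=0.01):
    """Given per-iteration halting probabilities for one input step,
    return (N, R) as defined in the ACT paper:
    N = min{n : sum_{i<=n} h_i >= 1 - eps}  (ponder steps),
    R = 1 - sum_{i<N} h_i                   (remaining probability)."""
    cum = 0.0
    for n, h in enumerate(halting_probs, start=1):
        cum += h
        if cum >= 1.0 - eps:
            # remainder: probability mass left before the halting iteration
            return n, 1.0 - (cum - h)
    # never crossed the threshold: halt at the last iteration
    return len(halting_probs), 1.0 - (cum - halting_probs[-1])

# example: the unit halts on its 3rd iteration with remainder 0.5
steps, rem = act_halting([0.2, 0.3, 0.6])
```

In the real model the halting probabilities come from a sigmoid on the hidden state at each iteration, and the final state and output are the R- and h-weighted mixtures over iterations; the sketch only shows how N and R in the table are derived.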
The ACT code is inspired by and adapted from DeNeutoy and abhitopia.