CyberZHG/keras-self-attention

ValueError: Shapes (None, 3) and (None, 50, 3) are incompatible

Keramatfar opened this issue · 2 comments

Hi,
I am getting this error at the last layer of my simple LSTM network when I add self-attention following your examples.
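For context, this mismatch typically happens because `SeqSelfAttention` returns the full 3D sequence `(batch, timesteps, features)`, so a `Dense` classifier stacked on top predicts per timestep, giving `(None, 50, 3)` while the labels are `(None, 3)`. Below is a minimal sketch that reproduces the error; the vocabulary size, 64 units, sequence length of 50, and 3 classes are assumptions inferred from the shapes in the traceback, not from the original post:

```python
import os
os.environ['TF_KERAS'] = '1'  # make keras-self-attention use tensorflow.keras

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
from keras_self_attention import SeqSelfAttention

model = Sequential([
    Embedding(input_dim=10000, output_dim=64, input_length=50),
    LSTM(64, return_sequences=True),                   # 3D output, as attention requires
    SeqSelfAttention(attention_activation='sigmoid'),  # still 3D: (None, 50, 64)
    Dense(3, activation='softmax'),                    # per-timestep output: (None, 50, 3)
])
model.compile(loss='categorical_crossentropy', optimizer='adam')

x = np.random.randint(0, 10000, size=(8, 50))
y = np.eye(3)[np.random.randint(0, 3, size=8)]  # labels shaped (8, 3)
model.fit(x, y)  # raises: Shapes (None, 3) and (None, 50, 3) are incompatible
```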

stale commented

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

Hi there, I resolved this issue by adding a Lambda layer that keeps only the last timestep, collapsing the 3D attention output back to 2D before the final Dense layer:

Lambda(lambda x: x[:, -1, :])(attn_layer)  # (None, timesteps, units) -> (None, units)
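For completeness, here is how that fix slots into a model like the sketch above; the layer sizes and names are my assumptions, not from the thread. Pooling over all timesteps with `GlobalAveragePooling1D()` is an equivalent alternative if you don't want to discard everything but the last step:

```python
import os
os.environ['TF_KERAS'] = '1'  # use the tensorflow.keras backend

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense, Lambda
from keras_self_attention import SeqSelfAttention

inputs = Input(shape=(50,))
x = Embedding(input_dim=10000, output_dim=64)(inputs)
x = LSTM(64, return_sequences=True)(x)
attn_layer = SeqSelfAttention(attention_activation='sigmoid')(x)  # (None, 50, 64)
x = Lambda(lambda t: t[:, -1, :])(attn_layer)   # keep last timestep: (None, 64)
outputs = Dense(3, activation='softmax')(x)     # (None, 3), now matches the labels

model = Model(inputs, outputs)
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.summary()
```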