CyberZHG/keras-self-attention

__init__() missing 3 required positional arguments: 'node_def', 'op', and 'message'

dingtine opened this issue · 1 comment

When I use the SeqSelfAttention layer, the code raises this error: __init__() missing 3 required positional arguments: 'node_def', 'op', and 'message'. How can I fix this?
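
For reference: the missing node_def/op/message arguments match the constructor of TensorFlow's internal error classes, and this traceback is often a symptom of mixing the standalone keras package with tensorflow.keras (or of a Keras/TensorFlow version mismatch) rather than a problem in the attention layer itself. Below is a minimal sketch of attaching SeqSelfAttention to a recurrent model, assuming the standalone keras package is used consistently and following README-style usage; layer sizes are placeholders.

```python
# Minimal sketch (assumptions: standalone keras, keras-self-attention installed
# via `pip install keras-self-attention`, vocabulary/label sizes are placeholders).
# Mixing `keras` and `tensorflow.keras` imports in one model is a common source
# of low-level TensorFlow errors like the one reported above.
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
model.add(keras.layers.Embedding(input_dim=10000, output_dim=128, mask_zero=True))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=64, return_sequences=True)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))  # attention over the LSTM outputs
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
```

If the rest of the model is built with tensorflow.keras instead, the package may need to be switched to the tf.keras backend (the author's related packages use a TF_KERAS=1 environment variable for this; treat that as an assumption and check the README for the exact mechanism).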

stale commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.