tensorflow/addons

seq2seq.BahdanauAttention raise TypeError: probability_fn is not an instance of str

BrandonStudio opened this issue · 9 comments

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): both Windows and Colab
  • TensorFlow version and how it was installed (source or binary): stable latest, pip
  • TensorFlow-Addons version and how it was installed (source or binary): stable latest
  • Python version: 3.10 / 3.9
  • Is GPU used? (yes/no): yes

Describe the bug

Calling BahdanauAttention with the default probability_fn value ("softmax") raises a TypeError.

I tried to debug and found that probability_fn was already a function (not a str) at the point where its type is checked.

Code to reproduce the issue

import tensorflow_addons as tfa
tfa.seq2seq.BahdanauAttention(1, 1, 1, normalize=False, name='BahdanauAttention')

Other info / logs

Hello, have you solved this issue? I also get the same error when initializing tfa.seq2seq.LuongAttention.

bhack commented

Do you have a very minimal gist to run to reproduce this?

@We-here Unfortunately no, I have not found a way to bypass the type check.

@bhack I think the code above is enough. The exception is thrown when the instance of the class is initialized, not afterwards.

bhack commented

Yes, because _process_probability_fn transforms the str back into a function.

Can you send a PR adding a test case to the specific test file below, to help reproduce this?

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/seq2seq/tests/attention_wrapper_test.py
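To make the failure mode concrete, here is a minimal self-contained sketch. The class and helper names below are invented for illustration, not the actual tensorflow-addons code: the idea is that the subclass resolves the "softmax" string to a callable before calling the base constructor, whose runtime annotation check then sees a function where it expects a str.

```python
# Illustrative sketch only -- names are invented, not the real
# tensorflow-addons implementation.

def _check_is_str(name, value):
    # Stand-in for what a runtime checker like @typechecked does with a
    # `probability_fn: str` annotation.
    if not isinstance(value, str):
        raise TypeError(f"{name} is not an instance of str")


class AttentionMechanismSketch:
    def __init__(self, probability_fn: str = "softmax"):
        _check_is_str("probability_fn", probability_fn)
        self.probability_fn = probability_fn


class BahdanauAttentionSketch(AttentionMechanismSketch):
    def __init__(self, probability_fn: str = "softmax"):
        # Mirrors the role of _process_probability_fn: the str is resolved
        # to a callable *before* it reaches the base constructor, so the
        # base class's runtime check sees a function instead of a str.
        fn = {"softmax": lambda scores: scores}[probability_fn]
        super().__init__(probability_fn=fn)


try:
    BahdanauAttentionSketch()  # default "softmax" still ends in a TypeError
except TypeError as err:
    print(err)  # probability_fn is not an instance of str
```

This matches the thread's diagnosis: the value is no longer a str by the time the annotation is enforced, so even the documented default fails.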

@seanpmorgan

Also seeing this issue on google colab.

  • Python version: 3.9
  • Tensorflow version: 2.11.0
  • Tensorflow-addons version: 0.19.0
  • Is GPU used? (yes/no): yes

@BrandonStudio I skipped the type checking by commenting out @typechecked on AttentionMechanism and its derived classes.
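For reference, the effect of that workaround can be sketched without editing installed sources: a no-op stand-in for the decorator simply returns the function unchanged, so the annotation is never enforced. The decorator name below is invented for illustration; in tensorflow-addons the decorator being commented out is typeguard's @typechecked.

```python
# Sketch of a no-op replacement for a runtime type-checking decorator.
# "typechecked_noop" is an invented name used for illustration.

def typechecked_noop(func=None, **kwargs):
    """Accept both @deco and @deco(...) usage, but check nothing."""
    if func is None:
        return lambda f: f  # used as @typechecked_noop(...)
    return func             # used as bare @typechecked_noop


@typechecked_noop
def make_attention(probability_fn: str = "softmax"):
    # With checking disabled, a callable slips through the str annotation.
    return probability_fn


fn = make_attention(probability_fn=len)
print(callable(fn))  # True
```

Note that disabling the decorator this way (or by editing the sources, as above) turns off all of the library's runtime type checks, not just this one.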

@DangMinh24 Did you just modify the library source code?

@BrandonStudio Yeah, I modified the tensorflow-addons code in my local environment.