tensorflow/text

Transformers Notebook does not work on Google Colab

Opened this issue · 5 comments

Gets stuck on this block of code:

```python
embed_pt = PositionalEmbedding(vocab_size=tokenizers.pt.get_vocab_size(), d_model=512)
embed_en = PositionalEmbedding(vocab_size=tokenizers.en.get_vocab_size(), d_model=512)

pt_emb = embed_pt(pt)
en_emb = embed_en(en)
```

Hi,
adding `.numpy()` after the `get_vocab_size()` calls will solve your issue:

```python
embed_pt = PositionalEmbedding(vocab_size=tokenizers.pt.get_vocab_size().numpy(), d_model=512)
embed_en = PositionalEmbedding(vocab_size=tokenizers.en.get_vocab_size().numpy(), d_model=512)
```
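
For what it's worth, the likely reason this helps (my reading of the behavior, not something stated in the tutorial) is that `get_vocab_size()` on the loaded tokenizer SavedModel returns a scalar `tf.Tensor`, while `PositionalEmbedding` expects a plain Python integer for `vocab_size`. A minimal sketch to check this, assuming `tokenizers` and `PositionalEmbedding` from the tutorial are already defined:

```python
vocab_size = tokenizers.pt.get_vocab_size()

# The raw value is a scalar EagerTensor; .numpy() unwraps it to a numpy integer,
# and int() makes it an ordinary Python int if you want to be explicit.
print(type(vocab_size))           # EagerTensor
print(type(vocab_size.numpy()))   # numpy integer scalar

embed_pt = PositionalEmbedding(vocab_size=int(vocab_size.numpy()), d_model=512)
```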

I got an error in the import section:


```
AttributeError                            Traceback (most recent call last)
in <cell line: 26>()
     24
     25 import tensorflow as tf
---> 26 import tensorflow_hub as hub
     27 import tensorflow_text as text
     28 import tensorflow_datasets as tfds

13 frames
/usr/local/lib/python3.10/dist-packages/tf_keras/src/saving/legacy/saved_model/load_context.py in
     66
     67
---> 68 tf.__internal__.register_load_context_function(in_load_context)
     69

AttributeError: module 'tensorflow._api.v2.compat.v2.__internal__' has no attribute 'register_load_context_function'
```
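
In case it helps anyone hitting the same `AttributeError`: it usually points to a version mismatch between the preinstalled TensorFlow and the `tf_keras` / `tensorflow_hub` packages in the Colab runtime. As a hedged workaround (not an official fix, and the exact package set is an assumption), reinstalling the ecosystem packages together so pip resolves mutually compatible versions, then restarting the runtime, has gotten past similar import failures for me:

```python
# Run in a Colab cell, then restart the runtime before re-running the
# tutorial's import cell.
# Assumption: letting pip resolve these packages together picks mutually
# compatible versions of tensorflow and tf-keras.
!pip install -q -U tensorflow tensorflow-text tensorflow-hub tensorflow-datasets tf-keras
```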

I got the same problem on May 1, 2024. I am struggling to understand why these tutorials are not tested before code updates are released. IMHO these frequent issues create a sense of frustration and a loss of faith in the tool.

Adding `.numpy()` worked for me. Posting in the hope that the notebook can be corrected so the next person can save themselves the hour I wasted trying to fix it myself.