'TFEmbeddings' object has no attribute 'word_embeddings'
hossein-amirkhani opened this issue · 2 comments
hossein-amirkhani commented
Trying to run this example, I encountered a 'TFEmbeddings' object has no attribute 'word_embeddings' error. Any help is appreciated.
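Roughly, the failing access looks like this (a sketch, assuming a TFDistilBertModel, since TFEmbeddings is DistilBERT's embedding layer class; the model name is illustrative):

```python
from transformers import TFDistilBertModel

model = TFDistilBertModel.from_pretrained("distilbert-base-uncased")

# Raises: AttributeError: 'TFEmbeddings' object has no attribute 'word_embeddings'
embedding_matrix = model.distilbert.embeddings.word_embeddings
```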
dr-smgad commented
Try using `.weight` instead of `.word_embeddings`, as per the latest Hugging Face implementation:
```python
class TFBertEmbeddings(tf.keras.layers.Layer):
    """Construct the embeddings from word, position and token_type embeddings."""

    def __init__(self, config: BertConfig, **kwargs):
        super().__init__(**kwargs)

        self.vocab_size = config.vocab_size
        self.type_vocab_size = config.type_vocab_size
        self.hidden_size = config.hidden_size
        self.max_position_embeddings = config.max_position_embeddings
        self.initializer_range = config.initializer_range
        self.embeddings_sum = tf.keras.layers.Add()
        self.LayerNorm = tf.keras.layers.LayerNormalization(epsilon=config.layer_norm_eps, name="LayerNorm")
        self.dropout = tf.keras.layers.Dropout(rate=config.hidden_dropout_prob)

    def build(self, input_shape: tf.TensorShape):
        with tf.name_scope("word_embeddings"):
            # The word-embedding matrix is now created here as `self.weight`
            self.weight = self.add_weight(
                name="weight",
                shape=[self.vocab_size, self.hidden_size],
                initializer=get_initializer(self.initializer_range),
            )
        # ... (the rest of build() creates the position and token_type embeddings)
```
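So in user code, the access changes roughly like this (a sketch assuming a TF BERT model on a recent transformers release; `get_input_embeddings()` is the generic accessor, and the checkpoint name is illustrative):

```python
from transformers import TFBertModel

model = TFBertModel.from_pretrained("bert-base-uncased")

# Old attribute (removed in recent transformers releases):
# embedding_matrix = model.bert.embeddings.word_embeddings

# New attribute: the matrix is created in build() as `self.weight`
embedding_matrix = model.bert.embeddings.weight  # shape: (vocab_size, hidden_size)

# Model-agnostic alternative via the generic accessor
embedding_matrix = model.get_input_embeddings().weight
```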
hossein-amirkhani commented
Thanks. Any pointers on the newly raised error: "TypeError: Cannot convert 'logits' to EagerTensor of dtype float"?