Error in Accessing Layers of Word Embedding model
Anacoder1 opened this issue · 1 comment
Anacoder1 commented
In the "Get word embeddings" section of the "Skip-gram model" notebook, the code is as follows:
merge_layer = model.layers[0]
word_model = merge_layer.layers[0]
word_embed_layer = word_model.layers[0]
weights = word_embed_layer.get_weights()[0][1:]
The above code raises the following error:
AttributeError: 'InputLayer' object has no attribute 'layers'
Because model.layers[0] is an InputLayer rather than a nested sub-model, it has no .layers attribute to index into. Replacing the above code with the following works correctly:
word_embed_layer = model.layers[2]
weights = word_embed_layer.get_weights()[0][1:]
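If the layer index differs across Keras versions, a more robust option is to look the Embedding layer up by type instead of hard-coding an index. The sketch below is not from the original notebook; it assumes the standalone keras package (use tensorflow.keras if that is what the notebook imports) and that model is the compiled skip-gram model:

# Hypothetical sketch: locate the word Embedding layer without a fixed index.
from keras.layers import Embedding

# Print every layer so the correct index can be confirmed by eye.
for i, layer in enumerate(model.layers):
    print(i, layer.name, type(layer).__name__)

# Take the first Embedding layer; in this skip-gram setup it is assumed to be
# the target-word embedding (any context embedding would come later).
word_embed_layer = next(l for l in model.layers if isinstance(l, Embedding))
weights = word_embed_layer.get_weights()[0][1:]  # drop row 0, the padding token
print(weights.shape)  # expected: (vocab_size - 1, embed_size)

Printing the layers first makes it easy to verify that index 2 (or whichever index the installed Keras version uses) really is the word embedding before extracting its weights.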
Sinchit commented
Thanks