Getting all layers
Nuccy90 opened this issue · 6 comments
Nuccy90 commented
Hi,
I was wondering whether there is a simple way to modify your code to return the three layers of the biLSTM, as in Peters et al. (2018), so as to train the task-specific weights for the weighted average.
Thank you
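For reference, the task-specific weighted average from Peters et al. (2018) combines the layer representations with softmax-normalized scalar weights and a learned scale:

```latex
\mathrm{ELMo}_k^{task} = \gamma^{task} \sum_{j} s_j^{task} \, \mathbf{h}_{k,j}
```

where the $s_j^{task}$ are the softmax-normalized task-specific weights and $\gamma^{task}$ is a learned global scale.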
strubell commented
It looks like it would be pretty easy to modify `sents2elmo` in `elmo.py`, here:

ELMoForManyLangs/elmoformanylangs/elmo.py, line 204 (at commit 3ca013f)
strubell commented
I just made (and tested) the change, it's in my fork: https://github.com/strubell/ELMoForManyLangs
Running `e.sents2elmo(sents, output_layer=-2)` will return all three layers' embeddings for each token.
Happy to submit a PR if the owners are interested.
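If it helps, here is a minimal sketch of the task-specific scalar mix on top of that output. It assumes each sentence's result from `sents2elmo(..., output_layer=-2)` is an array of shape `(n_layers, n_tokens, dim)` (an assumption about the fork's return format), and uses NumPy as a stand-in for a learned PyTorch module:

```python
import numpy as np

def scalar_mix(layers, weights, gamma=1.0):
    """Weighted average of biLSTM layers, Peters et al. (2018) style.

    layers:  array of shape (n_layers, n_tokens, dim) -- assumed shape of one
             sentence's output from e.sents2elmo(sents, output_layer=-2)
    weights: one unnormalized scalar per layer (learned in the downstream task)
    gamma:   learned global scale
    """
    s = np.exp(weights - np.max(weights))
    s = s / s.sum()                          # softmax over layer weights
    # contract the layer axis: (n_layers,) x (n_layers, n_tokens, dim) -> (n_tokens, dim)
    return gamma * np.tensordot(s, layers, axes=1)

# toy example: 3 layers, 4 tokens, 5-dim embeddings
layers = np.random.rand(3, 4, 5)
mixed = scalar_mix(layers, np.zeros(3))      # zero logits -> uniform weights
assert np.allclose(mixed, layers.mean(axis=0))
```

In an actual model the `weights` and `gamma` would be trainable parameters (e.g. `torch.nn.Parameter`) updated with the task loss.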
Oneplus commented
@strubell Great, many thanks! This should resolve the issue, so I will close it.
strubell commented
Great! Thanks for releasing these models and code, they're a wonderful
multilingual resource!