HIT-SCIR/ELMoForManyLangs

Getting all layers

Nuccy90 opened this issue · 6 comments

Hi,

I was wondering whether there is a simple way to modify your code to return the three layers of the biLSTM, as in Peters et al. (2018), so as to train the task-specific weights for the weighted average.
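Concretely, the combination I have in mind is the softmax-weighted sum of the layer representations described in Peters et al. (2018). A minimal PyTorch sketch of such a mixing module (the class name and shapes are just illustrative, nothing from this repo):

```python
import torch
import torch.nn as nn

class ScalarMix(nn.Module):
    """Task-specific weighted average of the layer representations,
    as in Peters et al. (2018): gamma * sum_j softmax(s)_j * h_j."""

    def __init__(self, num_layers: int = 3):
        super().__init__()
        # one unnormalized scalar weight per layer, plus a global scale gamma
        self.scalar_weights = nn.Parameter(torch.zeros(num_layers))
        self.gamma = nn.Parameter(torch.ones(1))

    def forward(self, layers: torch.Tensor) -> torch.Tensor:
        # layers: (num_layers, seq_len, dim) for a single sentence
        weights = torch.softmax(self.scalar_weights, dim=0)
        # weighted sum over the layer axis -> (seq_len, dim)
        return self.gamma * torch.einsum("l,lsd->sd", weights, layers)
```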

Thank you

It looks like it would be pretty easy to modify sents2elmo in elmo.py here:

payload = data[output_layer]
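For illustration, the selection logic around that line boils down to something like the standalone sketch below (the helper name and shapes are hypothetical; the real code lives inside sents2elmo and also handles batching and masks):

```python
import numpy as np

def select_payload(data: np.ndarray, output_layer: int) -> np.ndarray:
    # hypothetical helper; `data` holds the per-layer hidden states
    # for one sentence, shaped (num_layers, seq_len, dim)
    if output_layer == -1:
        # existing behaviour: average over the layers
        return np.average(data, axis=0)
    if output_layer == -2:
        # the added behaviour: return the full stack of layers
        return data
    # otherwise return the single requested layer
    return data[output_layer]
```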

I just made (and tested) the change; it's in my fork: https://github.com/strubell/ELMoForManyLangs

Running e.sents2elmo(sents, output_layer=-2) will return all three layers of embeddings for each token.
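For example (the model path is a placeholder, and the unweighted averaging step at the end is just illustrative):

```python
import numpy as np
from elmoformanylangs import Embedder

e = Embedder('/path/to/your/model')  # placeholder path to a downloaded model

sents = [['I', 'like', 'apples'], ['Hello', 'world']]

# with this change, each returned element stacks the three layers,
# shaped (3, seq_len, 1024) instead of (seq_len, 1024)
all_layers = e.sents2elmo(sents, output_layer=-2)

# a plain (unweighted) average over layers, just to show the shapes;
# a learned, task-specific mix would replace this step
averaged = [np.average(layers, axis=0) for layers in all_layers]
print(averaged[0].shape)  # (3, 1024): three tokens, 1024-dim vectors
```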

Happy to submit a PR if the owners are interested.

@strubell Thanks! A PR would be welcome!

Submitted. PR #39 should resolve this issue.

@strubell Great, many thanks! That should resolve this issue, so I will close it.