Clean up the embeddings API and parameter passing
matt-gardner opened this issue · 0 comments
Currently, keys for pretrained embeddings, embedding projection, dropout, and so on are flat parameters to TextTrainer. There is also an embedding_dim parameter, which is a dict with arbitrary allowed keys. We should fold the flat parameters into this dictionary as well, so the parameters look something like this:
```json
"embeddings": {
    "words": {
        "dim": 100,
        "pretrained_file": "/path/to/glove",
        "fine_tune": false
    },
    "characters": {
        "dim": 16
    }
}
```
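As a rough sketch of what parsing this nested structure could look like, here is one possible shape for the per-namespace config. The names `EmbeddingSpec` and `parse_embedding_params` are hypothetical, not existing deep_qa APIs, and the defaults shown are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class EmbeddingSpec:
    """Per-namespace embedding settings (hypothetical; defaults are illustrative)."""
    dim: int
    pretrained_file: Optional[str] = None
    fine_tune: bool = True
    project: bool = False
    dropout: float = 0.0


def parse_embedding_params(params: Dict[str, dict]) -> Dict[str, EmbeddingSpec]:
    # One spec per namespace ("words", "characters", ...), filling in
    # defaults for any keys the user omits.
    return {name: EmbeddingSpec(**spec) for name, spec in params.items()}


specs = parse_embedding_params({
    "words": {"dim": 100, "pretrained_file": "/path/to/glove", "fine_tune": False},
    "characters": {"dim": 16},
})
```

This way each namespace carries all of its own settings, instead of the flat top-level keys applying to every namespace at once.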