Summary of training/finetuning/prediction commands
NiteshMethani opened this issue · 2 comments
NiteshMethani commented
Hi,
Really impressive work and a nice repository combining all the NER models.
I have been trying to understand the codebase and its capabilities. To that end, I made a list of NER baselines with a corresponding command for each. Could the authors of this repo @allanj @furkan-celik @yuchenlin help me complete this list?
Model | Embeddings | Sample Command | Notes
---|---|---|---
BiLSTM | Random | |
BiLSTM + CharCNN | Random | NA |
BiLSTM + CharLSTM | Random | `python trainer.py --use_crf_rnn 0` |
BiLSTM + CharCNN + CRF | Random | NA |
BiLSTM + CharLSTM + CRF | Random | `python trainer.py --use_crf_rnn 1` |
BiLSTM + CharLSTM + CRF | FastText | |
BiLSTM + CharLSTM + CRF | static embedding from ELMo | |
BiLSTM + CharLSTM + CRF | static embedding from BERT | |
BiLSTM + CharLSTM + CRF | contextual embedding from ELMo | |
BiLSTM + CharLSTM + CRF | contextual embedding from BERT | |
Default bert-base-uncased | - | |
Default bert-large-uncased | - | |
Finetuned bert-base-uncased | - | |
Finetuned bert-base-uncased | concatenated with pretrained FastText embedding | |
Default roberta-base | - | |
Finetuned roberta-base | - | |
Finetuned roberta-base | concatenated with pretrained FastText embedding | |
I understand that some of these configurations might not be supported, and that the repo may support additional configurations not listed here, so feel free to add to or modify the table above. I feel such a one-stop table would help the community and would be a contribution towards the documentation.
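For anyone reading the table and wondering what the `--use_crf_rnn` rows toggle conceptually: the CRF layer scores whole tag sequences by combining per-token emission scores (from the BiLSTM) with tag-transition scores, and training needs the log partition function over all sequences. The sketch below is not code from this repo — the function names, shapes, and flag interpretation are my own illustration — it is a minimal NumPy version of the CRF forward algorithm, checked against brute-force enumeration:

```python
import numpy as np
from itertools import product

def lse(x, axis=0):
    # numerically stable log-sum-exp along the given axis
    m = x.max(axis=axis)
    return m + np.log(np.exp(x - np.expand_dims(m, axis)).sum(axis=axis))

def crf_log_partition(emissions, transitions):
    """Forward algorithm: log Z over all tag sequences in O(T * K^2).
    emissions: (T, K) per-token tag scores (e.g. from a BiLSTM).
    transitions: (K, K) score of moving from tag i to tag j."""
    alpha = emissions[0].copy()
    for t in range(1, emissions.shape[0]):
        # alpha[i] + transitions[i, j] + emissions[t, j], summed over i in log space
        alpha = lse(alpha[:, None] + transitions + emissions[t][None, :], axis=0)
    return lse(alpha, axis=0)

def brute_force_log_partition(emissions, transitions):
    # O(K^T) reference implementation: enumerate every tag sequence
    T, K = emissions.shape
    scores = []
    for seq in product(range(K), repeat=T):
        s = emissions[0, seq[0]]
        for t in range(1, T):
            s += transitions[seq[t - 1], seq[t]] + emissions[t, seq[t]]
        scores.append(s)
    return lse(np.array(scores), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    em = rng.normal(size=(4, 3))   # 4 tokens, 3 tags
    tr = rng.normal(size=(3, 3))
    print(np.isclose(crf_log_partition(em, tr),
                     brute_force_log_partition(em, tr)))  # True
```

Without the CRF (the `--use_crf_rnn 0` rows), the model would instead pick each token's tag independently from the emission scores alone.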
-Nitesh
allanj commented
Thanks, Nitesh. I will try to fill this in over time. Please allow some time for this.
NiteshMethani commented
Sure, thanks!