# tweet-sentiment-extraction_robert-base
Using RoBERT-base to extract the supporting phrase for each sentiment label:
selecting the part of the tweet (a word or phrase) that best reflects the tweet's sentiment.
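The tweet-sentiment-extraction competition scores predictions with word-level Jaccard similarity between the predicted and ground-truth phrases. A minimal sketch of that metric:

```python
def jaccard(str1: str, str2: str) -> float:
    """Word-level Jaccard similarity between two phrases:
    |intersection| / |union| of their lowercased word sets."""
    a = set(str1.lower().split())
    b = set(str2.lower().split())
    if not a and not b:  # both empty -> treat as a perfect match
        return 1.0
    return len(a & b) / len(a | b)

print(jaccard("so sad", "sad"))        # 0.5
print(jaccard("happy day", "happy day"))  # 1.0
```

Shorter predicted spans that still cover the key sentiment words therefore score well, which is why span selection (rather than whole-tweet output) matters.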
## RoBERT-base
Model pretrained on Romanian with masked language modeling (MLM) and next sentence prediction (NSP) objectives.
Model | Weights | Layers (L) | Hidden size (H) | Attention heads (A) | MLM accuracy | NSP accuracy |
---|---|---|---|---|---|---|
RoBERT-base | 114M | 12 | 768 | 12 | 0.6511 | 0.9802 |
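The extraction step is typically framed as QA-style span prediction: the model emits start and end logits over tokens, and the predicted phrase is the highest-scoring valid span mapped back to characters. A minimal decoding sketch; the logits, offsets, and `max_len` limit here are illustrative dummies, not output from the actual model:

```python
def best_span(start_logits, end_logits, max_len=30):
    """Return the (start, end) token pair with the highest
    start_logit + end_logit score, with end >= start and a
    maximum span length of max_len tokens."""
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

text = "i am so happy today"
# character offsets of each token in `text` (dummy word-level tokenization)
offsets = [(0, 1), (2, 4), (5, 7), (8, 13), (14, 19)]
start_logits = [0.1, 0.2, 1.5, 0.3, 0.1]
end_logits = [0.0, 0.1, 0.2, 2.0, 0.4]

s, e = best_span(start_logits, end_logits)
print(text[offsets[s][0]:offsets[e][1]])  # "so happy"
```

With a real tokenizer, the character offsets come from the tokenizer's offset mapping rather than being listed by hand.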
## References
- https://huggingface.co/readerbench/RoBERT-base
- https://www.kaggle.com/abhishek/roberta-inference-5-folds/data
- https://www.kaggle.com/shoheiazuma/tweet-sentiment-roberta-pytorch
- https://www.kaggle.com/aditidutta/tweet-sentiment-extraction-pytorch
- https://brunch.co.kr/@choseunghyek/7
- https://vanche.github.io/spanbert_roberta/
- https://www.kaggle.com/datasets/abhishek/roberta-base