The papers below were implemented using a Korean corpus.
- Using the Naver sentiment movie corpus v1.0
- Hyper-parameters were arbitrarily selected (epochs: 5, mini-batch: 128; for KoBERT, epochs: 2, mini-batch: 32).
| Model | Train ACC (120,000) | Validation ACC (30,000) | Test ACC (50,000) | Date |
| --- | --- | --- | --- | --- |
| SenCNN | 91.98% | 86.76% | 86.14% | 190918 |
| CharCNN | 86.54% | 82.25% | 81.89% | 190918 |
| ConvRec | 88.94% | 83.90% | 83.64% | 190918 |
| VDCNN | 87.05% | 84.42% | 84.09% | 190918 |
| SAN | 91.17% | 86.54% | 86.00% | 190919 |
| KoBERT | 94.47% | 89.85% | 89.60% | |
- Convolutional Neural Networks for Sentence Classification (as SenCNN)
- Character-level Convolutional Networks for Text Classification (as CharCNN)
- Efficient Character-level Document Classification by Combining Convolution and Recurrent Layers (as ConvRec)
- Very Deep Convolutional Networks for Text Classification (as VDCNN)
- A Structured Self-attentive Sentence Embedding (as SAN)
- BERT_single_sentence_classification (as KoBERT)
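As a rough illustration of what SenCNN computes, here is a minimal NumPy sketch of its core operation: multi-width 1-D convolutions over word embeddings followed by max-over-time pooling, as in Kim (2014). The shapes, filter counts, and random values are made up for the example and do not match the trained models in this repo.

```python
import numpy as np

def sen_cnn_features(embeddings, filters, widths):
    """Max-over-time pooled features from 1-D convolutions.

    embeddings: (seq_len, emb_dim) word-embedding matrix for one sentence.
    filters:    one weight tensor per width; filters[i] has shape
                (n_filters, widths[i], emb_dim).
    Returns the concatenated pooled feature vector.
    """
    seq_len, emb_dim = embeddings.shape
    pooled = []
    for W, w in zip(filters, widths):
        n_filters = W.shape[0]
        # Slide each width-w filter over the sentence (valid convolution).
        conv = np.empty((seq_len - w + 1, n_filters))
        for t in range(seq_len - w + 1):
            window = embeddings[t:t + w]              # (w, emb_dim)
            conv[t] = np.tensordot(W, window, axes=([1, 2], [0, 1]))
        # Max-over-time pooling keeps the strongest response per filter.
        pooled.append(conv.max(axis=0))
    return np.concatenate(pooled)

rng = np.random.default_rng(0)
emb = rng.normal(size=(10, 8))                        # 10 tokens, dim-8 embeddings
widths = [3, 4, 5]                                    # filter widths from the paper
filters = [rng.normal(size=(2, w, 8)) for w in widths]
feats = sen_cnn_features(emb, filters, widths)
print(feats.shape)  # (6,) — 2 filters per width, 3 widths
```

In the full model this feature vector would feed a softmax classifier over the sentiment labels.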
- Dataset created from https://github.com/songys/Question_pair
- Hyper-parameters were arbitrarily selected (epochs: 5, mini-batch: 64).
| Model | Train ACC (6,060) | Validation ACC (1,516) |
| --- | --- | --- |
| SAN | 91.93% | 81.46% |
| KoBERT | 90.80% | 93.33% |
- A Structured Self-attentive Sentence Embedding (as SAN)
- Siamese recurrent architectures for learning sentence similarity
- Stochastic Answer Networks for Natural Language Inference
- BERT_pairwise_text_classification (as KoBERT)
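Of the pairwise models above, the Siamese recurrent architectures paper scores a sentence pair by encoding each sentence with a shared LSTM and comparing the two encodings with exp(-L1 distance), which is bounded in (0, 1]. A minimal sketch of that similarity function (the encodings here are placeholder vectors, not real LSTM states):

```python
import numpy as np

def malstm_similarity(h_left, h_right):
    """Manhattan-LSTM similarity: exp(-||h_left - h_right||_1).

    Identical encodings score 1.0; the score decays toward 0 as the
    L1 distance between the two sentence encodings grows.
    """
    return float(np.exp(-np.abs(h_left - h_right).sum()))

a = np.array([0.2, -0.1, 0.5])                 # placeholder sentence encoding
print(malstm_similarity(a, a))                 # identical encodings -> 1.0
print(malstm_similarity(a, np.zeros(3)))       # strictly between 0 and 1
```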
- Character-Aware Neural Language Models
- Using the Naver nlp-challenge corpus for NER
- Hyper-parameters were arbitrarily selected.
- Bidirectional LSTM-CRF Models for Sequence Tagging
- End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
- Neural Architectures for Named Entity Recognition
- BERT_single_sentence_tagging
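The BiLSTM-CRF taggers above decode the best tag sequence with the Viterbi algorithm over per-token emission scores and tag-to-tag transition scores. A NumPy sketch of that decoding step (the scores are toy values, not from a trained model):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Best tag sequence under a linear-chain CRF.

    emissions:   (seq_len, n_tags) per-token tag scores from the BiLSTM.
    transitions: (n_tags, n_tags) score of moving from tag i to tag j.
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()                 # best score ending in each tag
    backptr = np.zeros((seq_len, n_tags), dtype=int)
    for t in range(1, seq_len):
        # score[i] + transitions[i, j] for every previous/current tag pair
        total = score[:, None] + transitions
        backptr[t] = total.argmax(axis=0)
        score = total.max(axis=0) + emissions[t]
    # Follow back-pointers from the best final tag.
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

em = np.array([[2.0, 0.0], [0.0, 2.0], [2.0, 0.0]])
path = viterbi_decode(em, np.zeros((2, 2)))
print(path)  # [0, 1, 0] — with zero transitions, decoding is per-token argmax
```

A learned transition matrix is what lets the CRF forbid invalid tag bigrams (e.g. I-PER directly after B-LOC) that a plain softmax tagger can emit.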
- Effective Approaches to Attention-based Neural Machine Translation
- Attention Is All You Need
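"Attention Is All You Need" is built on scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A small NumPy sketch with made-up shapes:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(QK^T / sqrt(d_k)) V — single head, no masking."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.normal(size=(4, 8))                    # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))                    # 6 key positions
V = rng.normal(size=(6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.sum(axis=-1))               # (4, 8); each weight row sums to 1
```

The full model runs several such heads in parallel and concatenates their outputs; the 1/sqrt(d_k) scaling keeps the softmax from saturating for large d_k.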
- Machine Comprehension Using Match-LSTM and Answer Pointer
- Bi-directional attention flow for machine comprehension
- BERT_question_answering
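The pointer-style QA models above predict an answer as a (start, end) token span over the passage; decoding picks the highest-scoring pair with start <= end, usually capped at a maximum span length. A toy sketch of that step (the logits are invented):

```python
import numpy as np

def best_answer_span(start_logits, end_logits, max_len=15):
    """Highest-scoring (start, end) span with start <= end < start + max_len."""
    best, best_score = (0, 0), -np.inf
    n = len(start_logits)
    for s in range(n):
        for e in range(s, min(n, s + max_len)):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

start = np.array([0.1, 2.0, 0.3, 0.0])         # toy start-position logits
end = np.array([0.0, 0.2, 1.5, 0.1])           # toy end-position logits
print(best_answer_span(start, end))  # (1, 2)
```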