PEEP-Talk: Deep Learning-based English Education Platform for Personalized Foreign Language Learning
Human-inspired AI, Korea University
PEEP-Talk is an educational platform that combines a deep learning-based persona conversation system with a feedback function for correcting English grammar. Unlike existing persona conversation systems, it adds a Context Detector (CD) module that automatically tracks the flow of the conversation and changes the topic in real time, giving the user the feeling of talking to a real person.
The source code is open, so you can download it and set up your own environment with ease; the platform itself is deployed through Kakao i Open Builder.
Screenshots: screenshot1 | screenshot2
By treating each persona as a situation, the platform enables English conversation learning tailored to that situation. The conversational agent model is built mainly on Hugging Face's TransferTransfo code.
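The snippet below is a minimal sketch of how a situation can play the role of a persona in a TransferTransfo-style generator. The checkpoint name (`gpt2`) and the plain-text concatenation are simplifying assumptions for illustration, not PEEP-Talk's exact input layout.

```python
# Minimal sketch of feeding a situation (persona) plus dialogue history to a
# TransferTransfo-style generator. Checkpoint and token layout are assumptions.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # placeholder checkpoint
model = AutoModelForCausalLM.from_pretrained("gpt2")

situation = ["you are ordering coffee at a cafe."]   # persona == situation
history = ["hi, what can I get you?"]
user_utterance = "can I get an iced latte, please?"

# TransferTransfo concatenates persona, history, and the current utterance
# into a single input sequence for the decoder.
prompt = " ".join(situation + history + [user_utterance])
input_ids = tokenizer.encode(prompt, return_tensors="pt")
reply_ids = model.generate(input_ids, max_new_tokens=40,
                           pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(reply_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```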
This module detects whether the user speaks appropriately for the suggested situation. It contains two BERT-based models and evaluates the conversation with the following two functions; based on their scores, the CD module decides whether to change the conversation topic (see the sketch after this list).
- Context Similarity: a model fine-tuned on the MRPC (Microsoft Research Paraphrase Corpus) dataset to measure how similar the user's utterance is to the suggested situation.
- Linguistic Acceptability: a model fine-tuned on the CoLA (Corpus of Linguistic Acceptability) dataset to judge whether the user's input is acceptable in human conversation.
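A minimal sketch of the two CD scorers is shown below. The public checkpoints and the 0.5 threshold are stand-ins for illustration, not the models shipped with PEEP-Talk.

```python
# Illustrative sketch of the two Context Detector scorers; checkpoint names
# and the 0.5 threshold are assumptions, not PEEP-Talk's released models.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

def positive_score(model_name, *texts):
    tok = AutoTokenizer.from_pretrained(model_name)
    mdl = AutoModelForSequenceClassification.from_pretrained(model_name)
    inputs = tok(*texts, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = mdl(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()   # P(positive class)

situation = "You are checking in at the airport."
utterance = "I would like a window seat, please."

# Context Similarity: sentence-pair classifier fine-tuned on MRPC
similarity = positive_score("textattack/bert-base-uncased-MRPC", situation, utterance)
# Linguistic Acceptability: single-sentence classifier fine-tuned on CoLA
acceptability = positive_score("textattack/bert-base-uncased-CoLA", utterance)

# If either score drops below the threshold, the CD module switches the topic.
change_topic = similarity < 0.5 or acceptability < 0.5
print(similarity, acceptability, change_topic)
```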
To give grammar feedback to English learners, we use GEC (Grammatical Error Correction) served as a REST API.
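A hedged example of calling such a GEC service over REST is shown below; the endpoint URL and JSON schema are hypothetical placeholders.

```python
# Hypothetical example of calling an external GEC service over REST;
# the endpoint URL and JSON fields are placeholders for illustration.
import requests

def correct_grammar(sentence: str) -> str:
    resp = requests.post(
        "http://localhost:5000/gec",   # placeholder endpoint
        json={"text": sentence},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json().get("corrected", sentence)

print(correct_grammar("She go to school yesterday."))
```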
.
├── data_preprocessing # data preprocess
├── alf_test.py # experiment for context detector
├── app.py # REST API code
├── kakao.py # REST API code for Kakao Channel
├── run.py # running PEEP-Talk
├── requirements.txt
├── LICENSE
└── README.md
Fine-tuning hyperparameters for the Context Detector models:

Model | Epoch | Batch size | Learning rate | Sequence length |
---|---|---|---|---|
BERT | 5 | 16 | 2e-05 | 256 |
ALBERT | 5 | 32 | 2e-05 | 128 |
RoBERTa | 5 | 16 | 3e-05 | 256 |
XLNet | 5 | 32 | 5e-05 | 256 |
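As an illustration, the sketch below fine-tunes one backbone (BERT on MRPC) with the hyperparameters from the table above using the Hugging Face `Trainer`; the actual PEEP-Talk training scripts may differ.

```python
# Sketch of fine-tuning one CD backbone (BERT on MRPC) with the hyperparameters
# listed above; PEEP-Talk's exact training script may differ.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

raw = load_dataset("glue", "mrpc")
def tokenize(batch):
    return tokenizer(batch["sentence1"], batch["sentence2"],
                     truncation=True, max_length=256, padding="max_length")
encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="cd-mrpc",
    num_train_epochs=5,                 # Epoch
    per_device_train_batch_size=16,     # Batch size
    learning_rate=2e-5,                 # Learning rate
)
trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["validation"])
trainer.train()
```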
Model | MRPC |
---|---|
BERT | 0.876 |
ALBERT | 0.884 |
RoBERTa | 0.923 |
XLNet | 0.928 |
Model | CoLA (Validation) | CoLA (Test) |
---|---|---|
BERT | 0.812 | 0.820 |
ALBERT | 0.728 | 0.736 |
RoBERTa | 0.739 | 0.755 |
XLNet | 0.851 | 0.870 |
To interact with PEEP-Talk:
python run.py
To run the Kakao server:
python kakao.py
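Once `app.py` is running, a request could look like the sketch below; the route name and JSON fields are assumptions for illustration only.

```python
# Hypothetical example of querying the running PEEP-Talk REST API (app.py);
# the route "/chat" and the JSON payload fields are illustrative assumptions.
import requests

payload = {"situation": "ordering coffee at a cafe",
           "utterance": "Can I get an iced latte, please?"}
response = requests.post("http://localhost:5000/chat", json=payload, timeout=10)
print(response.json())
```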
This project received the Best Paper Award at HCLT (Human & Cognitive Language Technology), the annual Hangul and Korean language information processing conference.
@inproceedings{lee2021peep,
title={PEEP-Talk: Deep Learning-based English Education Platform for Personalized Foreign Language Learning},
author={Lee, SeungJun and Jang, Yoonna and Park, Chanjun and Kim, Minwoo and Yahya, Bernardo N and Lim, Heuiseok},
booktitle={Annual Conference on Human and Language Technology},
pages={293--299},
year={2021},
organization={Human and Language Technology}
}
This project is licensed under the MIT License.