# knowledge-distillation-seq

This repository implements sequence-level knowledge distillation (Kim and Rush, 2016) on top of OpenNMT-py.

## Quick Start

1. Installation

   ```bash
   git clone https://github.com/OpenNMT/OpenNMT-py
   cd OpenNMT-py
   pip install -e .
   mkdir -p th-en/run
   ```
2. Download the dataset and place it under `th-en/`.
3. Create three `.yaml` config files inside the OpenNMT-py directory, one each for the teacher, student, and distilled networks. The teacher and student configs point at the dataset tokenized with the SentencePiece tokenizer, while the distilled config points at a dataset generated by the trained teacher network (sketched below). Instead of a shell script I prefer the notebook format, so to train all three networks open and run the notebook:

   ```bash
   jupyter notebook OpenNMT-py.ipynb
   ```
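Before the teacher and student configs can reference the data, step 2's corpus has to be tokenized with SentencePiece. A minimal sketch using the `sentencepiece` command-line tools; the file names, joint source/target model, and vocabulary size here are illustrative assumptions, not the repository's actual values:

```bash
# Train a SentencePiece model on the raw corpus (file names and vocab size assumed)
spm_train --input=th-en/train.th,th-en/train.en \
          --model_prefix=th-en/sp --vocab_size=8000

# Tokenize each side of the training data with the trained model
spm_encode --model=th-en/sp.model < th-en/train.th > th-en/train.sp.th
spm_encode --model=th-en/sp.model < th-en/train.en > th-en/train.sp.en
```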
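For reference, a minimal sketch of what the teacher config might look like in OpenNMT-py's YAML format; all paths, names, and step counts below are assumptions for illustration:

```yaml
# teacher.yaml -- illustrative sketch; paths and values are assumptions
save_data: th-en/run/teacher
src_vocab: th-en/run/teacher.vocab.src
tgt_vocab: th-en/run/teacher.vocab.tgt

# Training/validation data tokenized with the SentencePiece model
data:
    corpus_1:
        path_src: th-en/train.sp.th
        path_tgt: th-en/train.sp.en
    valid:
        path_src: th-en/valid.sp.th
        path_tgt: th-en/valid.sp.en

save_model: th-en/run/teacher_model
train_steps: 100000
valid_steps: 5000
```

The student config would look the same apart from smaller model dimensions. Each config is consumed by the standard OpenNMT-py entry points, `onmt_build_vocab -config teacher.yaml -n_sample -1` followed by `onmt_train -config teacher.yaml`; the notebook presumably wraps the same calls.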
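The distilled dataset itself comes from decoding the training source with the trained teacher under beam search, which is the core of sequence-level knowledge distillation in Kim and Rush (2016). A sketch using OpenNMT-py's `onmt_translate`, where the checkpoint name, beam size, and output path are assumptions:

```bash
# Decode the training source with the trained teacher (checkpoint name assumed)
onmt_translate \
    -model th-en/run/teacher_model_step_100000.pt \
    -src th-en/train.sp.th \
    -output th-en/train.distill.en \
    -beam_size 5 \
    -gpu 0
```

The distilled config then reuses the same data layout but sets the training corpus's `path_tgt` to `th-en/train.distill.en`, so the student learns to match the teacher's output sequences rather than the reference translations.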