gtn_applications

An applications library using GTN, with code to reproduce the experiments in "Differentiable Weighted Finite-State Transducers". Current examples include:

  • Offline handwriting recognition
  • Automatic speech recognition

Installing

  1. Build the Python bindings for the GTN library.

  2. conda activate gtn_env # using the same environment from Step 1

  3. conda install pytorch torchvision -c pytorch

  4. pip install -r requirements.txt
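
After these steps, a quick sanity check is to import both libraries and build a trivial GTN graph. This snippet is not part of the repository; it is only an illustrative example, assuming the bindings are importable as gtn:

import gtn
import torch

print("PyTorch version:", torch.__version__)

# Build a tiny acceptor: a start node, an accepting node, and one arc with label 0.
g = gtn.Graph()
g.add_node(True)          # start node
g.add_node(False, True)   # accepting node
g.add_arc(0, 1, 0)
print("GTN graph has", g.num_arcs(), "arc(s)")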

Training

We give an example of how to train on the IAM offline handwriting recognition benchmark.

First register for access to the IAM database, then download the dataset:

./datasets/download/iamdb.sh <path_to_data> <email> <password>

Then update the configuration file configs/iamdb/tds2d.json so that "data_path" points to the data path used above:

  "data" : {
    "dataset" : "iamdb",
    "data_path" : "<path_to_data>",
    "num_features" : 64
  },
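
If you prefer not to edit the JSON by hand, the same change can be made with a short Python snippet. This is only a convenience sketch using the standard json module; the data path below is a placeholder:

import json

config_file = "configs/iamdb/tds2d.json"
with open(config_file) as f:
    config = json.load(f)

# Point the dataset at the directory used in the download step above.
config["data"]["data_path"] = "/path/to/iamdb"  # replace with your <path_to_data>

with open(config_file, "w") as f:
    json.dump(config, f, indent=2)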

Single GPU training can be run with:

python train.py --config configs/iamdb/tds2d.json

To run distributed training with multiple GPUs:

python train.py --config configs/iamdb/tds2d.json --world_size <NUM_GPUS>
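
The value passed to --world_size should typically not exceed the number of visible GPUs. A quick way to check (plain PyTorch, not a repository script):

python -c "import torch; print(torch.cuda.device_count())"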

For a list of options type:

python train.py -h

Contributing

Use Black to format Python code.

First, install it:

pip install black

Then run with:

black <file>.py
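
Black also accepts directories, so the whole checkout can be formatted in one pass:

black .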

License

GTN is licensed under an MIT license. See LICENSE.