

Handshape Recognition


Results

The /results directory contains the results of training a <model> on a specific <dataset>:

.
├─ . . .
├─ results
│  ├─ <dataset>                            # results for a specific dataset.
│  │  ├─ <model>                           # results training a <model> on a <dataset>.
│  │  │  ├─ models                         # ".h5" files for trained models.
│  │  │  ├─ results                        # ".csv" files with the different metrics for each training period.
│  │  │  ├─ summaries                      # tensorboard summaries.
│  │  │  ├─ config                         # optional configuration files.
│  └─ └─ └─ <dataset>_<model>_results.csv  # ".csv" file in which the relationships between configurations, models, results and summaries are listed by date.
└─ . . .

where

<dataset> = lsa16 | rwth | . . .
<model> = dense-net | proto-net
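
For example, after training proto-net on lsa16, the corresponding subtree would be (the paths simply follow the placeholders above):

results/lsa16/proto-net/models
results/lsa16/proto-net/results
results/lsa16/proto-net/summaries
results/lsa16/proto-net/config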

To run TensorBoard, use the following command:

$ tensorboard --logdir=./results/<dataset>/<model>/summaries

Quickstart

$ ./bin/start [-n <string>] [-t <tag-name>] [--sudo] [--build]
<tag-name> = cpu | devel-cpu | gpu
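
For example, to start the gpu variant and rebuild the image (flag semantics assumed from the usage line above):

$ ./bin/start -t gpu --build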

Set up and use Docker

Build the Docker image,

$ docker build --rm -f dockerfiles/tf-py3-jupiter.Dockerfile -t handshape-recognition:latest .

and then run the image:

$ docker run --rm -u $(id -u):$(id -g) -p 6006:6006 -p 8888:8888 handshape-recognition:latest

Visit the URL printed in the container output; Jupyter is now ready for you to create notebooks.

If you want, you can attach a shell to the running container

$ docker exec -it <container-id> /bin/sh -c "[ -e /bin/bash ] && /bin/bash || /bin/sh"

The entire source code can be found in /develop:

$ cd /develop

To run TensorBoard, use the following command (or, alternatively, python -m tensorboard.main):

$ tensorboard --logdir=/path/to/summaries
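
For example, inside the container you could point it at one of the summaries directories described earlier (lsa16/proto-net is only an illustration):

$ python -m tensorboard.main --logdir=/develop/results/lsa16/proto-net/summaries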

Models

Prototypical Networks for Few-shot Learning

TensorFlow 2 implementation of the NIPS 2017 paper Prototypical Networks for Few-shot Learning.

Implementation based on protonet.
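
As a rough illustration of what the model computes (a minimal sketch, not this repository's actual code): an embedding network maps images to feature vectors, each class prototype is the mean of that class's support embeddings, and queries are classified by their distance to each prototype.

import tensorflow as tf

# Minimal sketch of a Prototypical Networks episode step (illustrative only).
def prototypical_logits(support_embeddings, query_embeddings):
    # support_embeddings: [n_classes, n_support, dim]
    # query_embeddings:   [n_queries, dim]
    prototypes = tf.reduce_mean(support_embeddings, axis=1)           # [n_classes, dim]
    diff = tf.expand_dims(query_embeddings, 1) - tf.expand_dims(prototypes, 0)
    sq_dist = tf.reduce_sum(tf.square(diff), axis=-1)                 # [n_queries, n_classes]
    return -sq_dist  # softmax over these logits yields class probabilities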

Run the following command to train on <config> with default parameters.

$ ./bin/protonet --mode train --config <config>

<config> = lsa16 | rwth
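
For example, to train on lsa16 with default parameters:

$ ./bin/protonet --mode train --config lsa16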

Evaluating

To run evaluation on a specific dataset

$ ./bin/protonet --mode eval --config <config>

<config> = lsa16 | rwth (rwth is not working yet)

Dense Net

TensorFlow 2 implementation of DenseNet using Squeeze-and-Excitation layers.

Inspired by flyyufelix's Keras implementation (https://github.com/flyyufelix/DenseNet-Keras).

For more information about DenseNet, please refer to the original paper (https://arxiv.org/abs/1608.06993).
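
As a rough sketch of what a Squeeze-and-Excitation block does (illustrative only; this repository's actual layer definitions may differ), each channel of a feature map is rescaled by a weight learned from globally pooled statistics:

from tensorflow.keras import layers

# Illustrative Squeeze-and-Excitation block (not this repository's implementation).
def se_block(x, reduction_ratio=16):
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                            # squeeze: [batch, channels]
    s = layers.Dense(channels // reduction_ratio, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)               # excitation: channel weights
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                                  # rescale the feature maps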

To train, run the following command:

$ python train_single.py

You can include the following arguments for further customization; a combined example invocation follows the list.

Dataset:

--dataset=<dataset>

<dataset> = lsa16 | rwth

Rotation angle in degrees:

--rotation=<int>

Width shift:

--w-shift=<float>

Height shift:

--h-shift=<float>

Horizontal flip:

--h-flip=<boolean>

Densenet's growth rate:

--growth-r=<int>

Densenet's number of dense layers:

--nb-layers=<nb-layers>

<nb-layers> = <int>[:<int>]*

Densenet's reduction:

--reduction=<float>

Learning rate:

--lr=<float>

Epochs:

--epochs=<int>

Maximum patience:

--patience=<int>

Log frequency:

--log-freq=<int>

Save frequency (only works if checkpoints is set to True):

--save-freq=<int>

Models directory (only works if checkpoints is set to True):

--models-dir=<string>

Results directory:

--results_dir=<string>

Checkpoint model saving:

--checkpoints=<boolean>

Use of class weights:

--weight_classes=<boolean>
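
For example, a combined invocation using several of the flags above (the values are illustrative only):

$ python train_single.py --dataset=lsa16 --rotation=10 --h-flip=True --growth-r=12 --nb-layers=6:12:24 --reduction=0.5 --lr=0.001 --epochs=100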