
EchoRob

Recurrent Neural Network for Syntax Learning with Flexible Representations for Robotic Architectures

Want to be updated?

Drop me an email if you want to be updated when new features, corpora, or new versions of the model become available: xavier dot hinaut at inria dot fr

Intro on Reservoir Computing

Reservoir Computing (RC) is a type of Recurrent Neural Network used in this EchoRob project. Specifically, we use an instance of RC called Echo State Networks (ESNs).

Here you can find the ReservoirPy repository with a simple ESN class that can be easily used for many applications.
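If you are curious how an ESN works internally, below is a minimal NumPy sketch of the core idea: a fixed random recurrent "reservoir" driven by the input, and a linear readout trained by ridge regression. This is only an illustration with toy data and arbitrary hyperparameters (reservoir size, spectral radius, leak rate); it is not the ReservoirPy API, and all names below are chosen for the example.

```python
# Minimal ESN sketch (illustration only, not the ReservoirPy API).
import numpy as np

rng = np.random.default_rng(42)

n_inputs, n_reservoir, n_outputs = 3, 100, 2
spectral_radius, leak_rate, ridge = 1.25, 0.3, 1e-6

# Random input and recurrent weights; the recurrent matrix is rescaled
# so that its spectral radius matches the chosen value.
W_in = rng.uniform(-1, 1, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(U):
    """Collect reservoir states for an input sequence U of shape (T, n_inputs)."""
    x = np.zeros(n_reservoir)
    states = []
    for u in U:
        # Leaky-integrator update: the reservoir is recurrent but never trained.
        x = (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Training = ridge regression from reservoir states to desired outputs (toy data here).
U_train = rng.uniform(-1, 1, (200, n_inputs))
Y_train = rng.uniform(-1, 1, (200, n_outputs))
X = run_reservoir(U_train)
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y_train).T

# Prediction: a simple linear readout of the reservoir states.
Y_pred = run_reservoir(U_train) @ W_out.T
```

The same principle applies in ReservoirPy and in the EchoRob model: only the readout weights are learned, which keeps training fast even for large reservoirs.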

Romain Pastureau (intern in our Mnemosyne team; summer 2016) created great iPython Notebook tutorials:

https://github.com/neuronalX/Reservoir-Jupyter

Click on the "launch Binder" button, and then run one of the notebooks.

Corpora

Most corpora are available in the folder /corpora.

Corpora for 15 languages are available here:

2019_TCDS_15languages.txt

A bilingual corpus in English and French, obtained by asking 5 users of each language to give command instructions to robots:

These corpora were used in the CoCo (Cognitive Computation) @ NIPS 2015 workshop:

Subjects were asked to provide sentences in English or in French for each of the 38 videos provided. The videos are available here.

Older corpora (for the PLoS ONE 2013 paper), used with different versions of the model, are available here:

Source code

ROS module

(soon here ...)

PLoS ONE (2013)

xavierhinaut.com/downloads

(Other code available soon ...)

Videos

References

More information about the syntax learning model and related papers. The most recent papers can be found on my ResearchGate or Google Scholar profiles.

Hinaut & Dominey (2013) PLoS ONE

The most detailed description of the general model features and of the background of the approach. Paper (open access)
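As a rough illustration of the construction-based approach used in these papers, the sketch below abstracts a sentence into a grammatical construction by keeping function (closed-class) words and replacing content (semantic) words with a placeholder. The function-word list and the "SW" marker here are simplified examples chosen for the illustration, not the exact ones from the paper.

```python
# Hedged sketch: abstracting a sentence into a grammatical construction.
# The function-word list and the "SW" placeholder are illustrative only.
FUNCTION_WORDS = {"the", "a", "to", "was", "by", "it", "that", "and", "then", "on"}

def sentence_to_construction(sentence):
    """Return the construction form and the list of extracted semantic words."""
    construction, semantic_words = [], []
    for word in sentence.lower().split():
        if word in FUNCTION_WORDS:
            construction.append(word)          # keep function words as-is
        else:
            construction.append("SW")          # placeholder for a semantic word
            semantic_words.append(word)
    return " ".join(construction), semantic_words

print(sentence_to_construction("the ball was pushed by the dog"))
# ('the SW was SW by the SW', ['ball', 'pushed', 'dog'])
```

Because the reservoir only sees the abstracted construction, the same learned mapping can be reused for sentences containing new semantic words.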

Hinaut et al. (2014) Frontiers in NeuroRobotics

Implementation of the model on the iCub humanoid robot and data collection with naive users. Presentation of the "reverse" version of the model, which produces sentences based on meaning representations. Paper (open access)

Hinaut et al. (2015) CoCo (Cognitive Computation) @ NIPS workshop

The syntax model generalizes to English and French at the same time using the same random reservoir (hidden layer). The model is enhanced with the ability to process "unknown/infrequent words". Paper on ResearchGate

Twiefel, Hinaut, Wermter (2016) RO-MAN

Integration of the syntax model into an HRI experiment using the Nao humanoid robot. Paper on ResearchGate

Twiefel et al. (2016) ESANN

Adaptation of the syntax model to a crowdsourced (noisy) corpus of robot arm commands. Paper on ResearchGate

Hinaut (2018) ICDL

Testing the (syntax) model on 3 levels of abstraction: syntactic constructions as before, but also sequences of words and sequences of phonemes. Results indicate that ESNs are able to generalize on all of them, even when the corpus is noisy and contains OOV (Out-of-Vocabulary) words. GitHub repo and paper

Hinaut & Spranger (2019) ICDL

Juven & Hinaut (2020) IJCNN

Pedrelli & Hinaut (2020) IJCNN

Dinh & Hinaut (2020) ICDL