We propose a fully imaginary keyboard (I-Keyboard) with a deep neural decoder (DND). Below are a few features of I-Keyboard.
- The eyes-free ten-finger typing scenario of I-Keyboard requires neither a calibration step nor a pre-defined typing region (explored for the first time in this study!).
- The invisibility of I-Keyboard maximizes the usability of mobile devices.
- The deep neural architecture of DND allows users to start typing from any position on the touch screen, at any angle.
- We collected the largest user dataset in the process of developing I-Keyboard and have made it public!
- I-Keyboard achieved an 18.95% increase in typing speed (45.57 WPM) and a 4.06% increase in accuracy (95.84%) over the baseline.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
- MS Visual Studio >= 2015
- CUDA >= 9.0
- Python 3.6+
- TensorFlow >= 1.14
- File Name: We have two types of file names.
- D_A_S_I_T_P
- D: date (YYYYMMDD).
- A: age.
- S: sex (male or female).
- I: participant's initials (to distinguish participants).
- T: typing speed on a physical keyboard.
- P: whether the palm was attached or detached while typing.
- e.g. 20190117_24_male_lhk_200_x
- D_A_S_M_T
- M: participant's major (to distinguish participants).
- e.g. 20180831_26_male_enginerring_140
- Data Format (a parsing sketch follows this list)
- One or two line(s) of phrases.
- When the enter key is pressed, two lines appear.
- The two phrases are separated by the enter key.
- The sequence of x touch positions.
- The sequence of y touch positions.
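For readers who want to load the raw files directly, here is a minimal parsing sketch. It assumes the touch-position sequences are whitespace-separated numbers in plain-text files; the helper names (`parse_file_name`, `read_record`) are illustrative and not part of the released code.

```python
# Minimal sketch for working with the released files (illustrative only).
# Assumption: touch positions are whitespace-separated numbers in plain text.

def parse_file_name(name):
    """Split a D_A_S_I_T_P (or D_A_S_M_T) style file name into its fields."""
    fields = name.split('_')
    parsed = {'date': fields[0], 'age': int(fields[1]), 'sex': fields[2]}
    if len(fields) == 6:   # D_A_S_I_T_P
        parsed.update(initial=fields[3], speed=int(fields[4]), palm=fields[5])
    else:                  # D_A_S_M_T
        parsed.update(major=fields[3], speed=int(fields[4]))
    return parsed

def read_record(path):
    """Read one data file: phrase line(s), then the x and y touch sequences."""
    with open(path) as f:
        lines = [line.rstrip('\n') for line in f if line.strip()]
    phrases = lines[:-2]                        # one or two phrase lines
    xs = [float(v) for v in lines[-2].split()]  # x touch positions
    ys = [float(v) for v in lines[-1].split()]  # y touch positions
    return phrases, xs, ys

print(parse_file_name('20190117_24_male_lhk_200_x'))
```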
To visualize user behavior and analyze its statistics, run the commands below. They create two directories ('list_data' and 'figs') and save the preprocessed results and the figures there. A minimal single-file plotting sketch is given after the commands.
cd user_behavior_analysis
# preprocess the raw_data and save the result in the 'list_data' directory
python3 preprocessing.py
# visualize each participant's typing behavior
# and extract statistics over all participants
python3 user_analysis.py
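If you only want a quick look at a single participant's touch distribution without running the full pipeline, a minimal matplotlib sketch like the one below may help. The file path is a placeholder, and it assumes the whitespace-separated raw-data layout described above.

```python
# Scatter plot of one participant's touch positions (illustrative only).
# 'raw_data/20190117_24_male_lhk_200_x' is a placeholder path.
import matplotlib.pyplot as plt

with open('raw_data/20190117_24_male_lhk_200_x') as f:
    lines = [line.strip() for line in f if line.strip()]

xs = [float(v) for v in lines[-2].split()]  # x touch positions
ys = [float(v) for v in lines[-1].split()]  # y touch positions

plt.scatter(xs, ys, s=5)
plt.gca().invert_yaxis()  # flip y if the data uses screen coordinates (assumption)
plt.xlabel('x touch position')
plt.ylabel('y touch position')
plt.title('Touch positions of one participant')
plt.show()
```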
For training, set up a conda environment (recommended); alternatively, you can use pip. A quick environment check follows the commands.
conda create --name ikeyboard python=3.6
conda activate ikeyboard
conda install -c conda-forge tensorflow-gpu=1.14 editdistance
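As a quick sanity check that the new environment picked up the GPU build of TensorFlow (not part of the original instructions, just a common verification step):

```python
# Run inside the activated 'ikeyboard' environment.
import tensorflow as tf

print(tf.__version__)              # expect 1.14.x
print(tf.test.is_gpu_available())  # True if CUDA/cuDNN are visible to TensorFlow
```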
Then, prepare the data records using the "data.py" script as follows:
python data.py
Executing the above script will generate a set of .tfrecords and .pkl objects required for training and testing.
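If you want to verify the generated records before training, a small inspection script such as the following can help. It assumes `data.py` writes standard `tf.train.Example` records; the file name is a placeholder, and the feature keys are printed rather than assumed.

```python
# Count the records in one .tfrecords file and peek at its feature keys.
# 'train.tfrecords' is a placeholder; substitute a file produced by data.py.
import tensorflow as tf

path = 'train.tfrecords'
count = 0
first_keys = None
for serialized in tf.compat.v1.io.tf_record_iterator(path):
    if first_keys is None:
        example = tf.train.Example.FromString(serialized)
        first_keys = list(example.features.feature.keys())
    count += 1

print('records:', count)
print('feature keys of the first example:', first_keys)
```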
Finally, you can train and test the proposed DND model as follows:
python train.py --name experiment_name
python test_experiment.py
For training and testing options, refer to "train_script.py" and "test_script.py", respectively.
- The paper has been accepted! (IEEE Trans. on Cybernetics)
- Any comments are welcome
- Thank you for your attention
Please consider citing this project in your publications if you find it helpful. The BibTeX entry is given below.
@article{kim2019keyboard,
title={I-Keyboard: Fully Imaginary Keyboard on Touch Devices Empowered by Deep Neural Decoder},
author={Kim, Ue-Hwan and Yoo, Sahng-Min and Kim, Jong-Hwan},
journal={IEEE Transactions on Cybernetics},
year={2019}
}
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2017R1A2A1A17069837).