MNIST_AttentionMap

[TensorFlow] Attention mechanism with MNIST dataset

Usage

$ python run.py
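
The contents of run.py are not reproduced in this README. As a rough sketch only, a soft-attention classifier for MNIST in TensorFlow 2.x could be structured as follows; the function name build_models, the 1x1-convolution attention logits, and all layer sizes are illustrative assumptions rather than the repository's actual implementation.

import tensorflow as tf

def build_models(num_classes=10):
    """Build a classifier plus a companion model that also returns the attention map."""
    inputs = tf.keras.Input(shape=(28, 28, 1))
    # Convolutional feature extractor.
    feats = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    feats = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(feats)
    # One attention logit per spatial position, softmax-normalized so the
    # 28x28 weights sum to one.
    att = tf.keras.layers.Conv2D(1, 1)(feats)
    att = tf.keras.layers.Reshape((28 * 28,))(att)
    att = tf.keras.layers.Softmax()(att)
    att = tf.keras.layers.Reshape((28, 28, 1), name="attention")(att)
    # Attention-weighted pooling of the features, then a classification head.
    weighted = tf.keras.layers.Lambda(lambda t: t[0] * t[1])([feats, att])
    pooled = tf.keras.layers.GlobalAveragePooling2D()(weighted)
    preds = tf.keras.layers.Dense(num_classes, activation="softmax", name="digit")(pooled)
    classifier = tf.keras.Model(inputs, preds)               # used for training
    with_attention = tf.keras.Model(inputs, [preds, att])    # exposes the attention map
    return classifier, with_attention

if __name__ == "__main__":
    (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None].astype("float32") / 255.0
    classifier, _ = build_models()
    classifier.compile(optimizer="adam",
                       loss="sparse_categorical_crossentropy",
                       metrics=["accuracy"])
    classifier.fit(x_train, y_train, batch_size=128, epochs=1)

Because the two models share layers, training the classifier also trains the attention branch used for visualization.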

Result

Training

Loss graph.

Test

Each figure shows the input digit, the attention map, and the overlaid result, in that order.
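
A minimal sketch of producing such a three-panel figure with matplotlib, assuming a model that returns the attention map alongside its prediction (for example the hypothetical with_attention model sketched under Usage):

import numpy as np
import matplotlib.pyplot as plt

def plot_attention(image, attention_map):
    """Plot the input digit, the attention map, and the two overlaid."""
    image = np.squeeze(image)                    # (28, 28)
    attention_map = np.squeeze(attention_map)    # (28, 28)
    fig, axes = plt.subplots(1, 3, figsize=(9, 3))
    axes[0].imshow(image, cmap="gray")
    axes[0].set_title("Input digit")
    axes[1].imshow(attention_map, cmap="jet")
    axes[1].set_title("Attention map")
    axes[2].imshow(image, cmap="gray")
    axes[2].imshow(attention_map, cmap="jet", alpha=0.5)    # overlay
    axes[2].set_title("Overlay")
    for ax in axes:
        ax.axis("off")
    plt.tight_layout()
    plt.show()

# Demo with random arrays; in practice pass an MNIST image and the
# attention map predicted for it by the model.
rng = np.random.default_rng(0)
plot_attention(rng.random((28, 28, 1)), rng.random((28, 28, 1)))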

Further usage

The attention map can also be used to detect the location of digits in an image, as sketched below.
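
One possible way to get a digit location from the attention map is to normalize it, threshold it, and take the bounding box of the remaining positions; the helper name and the 0.5 threshold below are illustrative assumptions.

import numpy as np

def attention_bounding_box(attention_map, threshold=0.5):
    """Return (row_min, row_max, col_min, col_max) of the high-attention region."""
    att = np.squeeze(attention_map).astype("float32")
    att = (att - att.min()) / (att.max() - att.min() + 1e-8)   # normalize to [0, 1]
    rows, cols = np.where(att >= threshold)
    if rows.size == 0:
        return None                                            # nothing above threshold
    return rows.min(), rows.max(), cols.min(), cols.max()

# Example with a synthetic peak in a 28x28 map.
demo = np.zeros((28, 28))
demo[10:18, 12:20] = 1.0
print(attention_bounding_box(demo))   # (10, 17, 12, 19)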

Requirements

  • TensorFlow 2.3.0
  • NumPy 1.18.5

Additional Resources

[1] Simple attention mechanism test by Myung Jin Kim