text_classifier

Natural Language Processing - Tokenization (NLP Zero to Hero) conversion to tfjs

JavaScript · Apache-2.0 license

Related to Tokenization (NLP Zero to Hero) - Part 1, Part 2 and Part 3 by Laurence Moroney. YouTube link

The code is 87 lines long, including annotations that explain the process flow step by step; these make it much easier to follow, especially for machine-learning beginners.
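The preprocessing follows the Keras Tokenizer pattern from the tutorial: build a word index over the corpus, convert sentences to integer sequences, and pad them to a fixed length (100, per the model summary below). A minimal plain-JavaScript sketch of that idea - illustrative only, not the actual sarc.js code:

```javascript
// Sketch of Keras-style tokenization + post-padding (illustrative names).
const OOV = 1; // out-of-vocabulary index; 0 is reserved for padding

function fitTokenizer(sentences) {
  const wordIndex = {};
  let next = 2; // indices 0 and 1 are reserved
  for (const s of sentences) {
    for (const w of s.toLowerCase().split(/\s+/)) {
      if (!(w in wordIndex)) wordIndex[w] = next++;
    }
  }
  return wordIndex;
}

function textsToSequences(wordIndex, sentences) {
  return sentences.map(s =>
    s.toLowerCase().split(/\s+/).map(w => wordIndex[w] ?? OOV)
  );
}

function padSequences(sequences, maxLen) {
  // 'post' padding and truncating, as in the tutorial
  return sequences.map(seq => {
    const out = seq.slice(0, maxLen);
    while (out.length < maxLen) out.push(0);
    return out;
  });
}

const train = ['i love my dog', 'i love my cat'];
const wordIndex = fitTokenizer(train);
const padded = padSequences(textsToSequences(wordIndex, train), 6);
console.log(wordIndex); // { i: 2, love: 3, my: 4, dog: 5, cat: 6 }
console.log(padded);    // [ [2,3,4,5,0,0], [2,3,4,6,0,0] ]
```

Unseen words map to the OOV index, so inference-time sentences never crash the lookup.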

npm install
node sarc.js

The original Google Colab notebook is here

__________________________________________________________________________________________
Layer (type)                Input Shape               Output shape              Param #   
==========================================================================================
embedding_Embedding1 (Embed [[null,100]]              [null,100,16]             160000
__________________________________________________________________________________________
global_average_pooling1d_Gl [[null,100,16]]           [null,16]                 0
__________________________________________________________________________________________
dense_Dense1 (Dense)        [[null,16]]               [null,24]                 408
__________________________________________________________________________________________
dense_Dense2 (Dense)        [[null,24]]               [null,1]                  25
==========================================================================================
Total params: 160433
Trainable params: 160433
Non-trainable params: 0
__________________________________________________________________________________________
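The parameter counts in the summary can be checked by hand: an embedding layer has vocab_size × embedding_dim weights, and a dense layer has inputs × units weights plus units biases. Assuming a 10,000-word vocabulary (which the 160,000 embedding parameters imply at dimension 16):

```javascript
// Verify the parameter counts printed in the model summary above.
const vocabSize = 10000;  // assumption: implied by 160000 / 16
const embeddingDim = 16;

const embeddingParams = vocabSize * embeddingDim;  // 10000 * 16 = 160000
const dense1Params = embeddingDim * 24 + 24;       // weights + biases = 408
const dense2Params = 24 * 1 + 1;                   // 25
const total = embeddingParams + dense1Params + dense2Params;

console.log({ embeddingParams, dense1Params, dense2Params, total }); // total: 160433
```

The global average pooling layer contributes zero parameters, since it only averages the embedding vectors over the sequence dimension.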
Epoch  1/30 - 6441ms 322us/step - acc=0.571 loss=0.677 val_acc=0.709 val_loss=0.619
Epoch  2/30 - 5333ms 267us/step - acc=0.818 loss=0.454 val_acc=0.837 val_loss=0.390
Epoch  3/30 - 5455ms 273us/step - acc=0.872 loss=0.323 val_acc=0.853 val_loss=0.355
Epoch  4/30 - 5663ms 283us/step - acc=0.893 loss=0.273 val_acc=0.847 val_loss=0.351
Epoch  5/30 - 5614ms 281us/step - acc=0.908 loss=0.238 val_acc=0.852 val_loss=0.345
Epoch  6/30 - 5632ms 282us/step - acc=0.919 loss=0.211 val_acc=0.845 val_loss=0.357
Epoch  7/30 - 5668ms 283us/step - acc=0.930 loss=0.188 val_acc=0.848 val_loss=0.358
Epoch  8/30 - 5569ms 278us/step - acc=0.939 loss=0.169 val_acc=0.851 val_loss=0.366
Epoch  9/30 - 5601ms 280us/step - acc=0.944 loss=0.154 val_acc=0.840 val_loss=0.404
Epoch 10/30 - 5572ms 279us/step - acc=0.951 loss=0.140 val_acc=0.849 val_loss=0.396
Epoch 11/30 - 5887ms 294us/step - acc=0.956 loss=0.128 val_acc=0.847 val_loss=0.415
Epoch 12/30 - 5656ms 283us/step - acc=0.960 loss=0.117 val_acc=0.842 val_loss=0.442
Epoch 13/30 - 5513ms 276us/step - acc=0.965 loss=0.107 val_acc=0.844 val_loss=0.452
Epoch 14/30 - 5535ms 277us/step - acc=0.968 loss=0.0996 val_acc=0.841 val_loss=0.486
Epoch 15/30 - 5574ms 279us/step - acc=0.971 loss=0.0909 val_acc=0.841 val_loss=0.497
Epoch 16/30 - 5463ms 273us/step - acc=0.974 loss=0.0845 val_acc=0.839 val_loss=0.522
Epoch 17/30 - 5468ms 273us/step - acc=0.975 loss=0.0798 val_acc=0.828 val_loss=0.563
Epoch 18/30 - 5509ms 275us/step - acc=0.978 loss=0.0734 val_acc=0.835 val_loss=0.566
Epoch 19/30 - 5501ms 275us/step - acc=0.980 loss=0.0686 val_acc=0.834 val_loss=0.598
Epoch 20/30 - 5592ms 280us/step - acc=0.981 loss=0.0629 val_acc=0.833 val_loss=0.615
Epoch 21/30 - 5568ms 278us/step - acc=0.982 loss=0.0596 val_acc=0.831 val_loss=0.654
Epoch 22/30 - 5477ms 274us/step - acc=0.985 loss=0.0556 val_acc=0.825 val_loss=0.709
Epoch 23/30 - 5566ms 278us/step - acc=0.985 loss=0.0519 val_acc=0.828 val_loss=0.695
Epoch 24/30 - 5611ms 281us/step - acc=0.985 loss=0.0486 val_acc=0.823 val_loss=0.730
Epoch 25/30 - 5481ms 274us/step - acc=0.988 loss=0.0440 val_acc=0.826 val_loss=0.749
Epoch 26/30 - 5521ms 276us/step - acc=0.988 loss=0.0415 val_acc=0.825 val_loss=0.782
Epoch 27/30 - 5558ms 278us/step - acc=0.990 loss=0.0378 val_acc=0.824 val_loss=0.804
Epoch 28/30 - 5503ms 275us/step - acc=0.991 loss=0.0360 val_acc=0.820 val_loss=0.862
Epoch 29/30 - 5480ms 274us/step - acc=0.991 loss=0.0331 val_acc=0.819 val_loss=0.904
Epoch 30/30 - 5439ms 272us/step - acc=0.993 loss=0.0308 val_acc=0.819 val_loss=0.903
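Note the classic overfitting pattern in the log: training accuracy keeps climbing to 0.993 while val_loss bottoms out around epoch 5 and then rises steadily. A common remedy is to stop at, or restore, the epoch with the lowest validation loss (tfjs also provides tf.callbacks.earlyStopping for this). Picking the best epoch from the logged values:

```javascript
// Find the epoch with the lowest validation loss from the training log above.
const valLoss = [
  0.619, 0.390, 0.355, 0.351, 0.345, 0.357, 0.358, 0.366, 0.404, 0.396,
  0.415, 0.442, 0.452, 0.486, 0.497, 0.522, 0.563, 0.566, 0.598, 0.615,
  0.654, 0.709, 0.695, 0.730, 0.749, 0.782, 0.804, 0.862, 0.904, 0.903,
];

let bestEpoch = 1;
for (let i = 1; i < valLoss.length; i++) {
  if (valLoss[i] < valLoss[bestEpoch - 1]) bestEpoch = i + 1;
}
console.log(`best epoch: ${bestEpoch}, val_loss: ${valLoss[bestEpoch - 1]}`); // epoch 5, 0.345
```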

Final predictions (sarcasm probability) for the two example sentences:

0.8131120204925537

0.0005136687541380525