classify_text_with_bert does not ALWAYS use GPU (Kaggle)
rbhambriiit opened this issue · 3 comments
Hi,
I tried running the tutorial
https://www.tensorflow.org/text/tutorials/classify_text_with_bert
on Kaggle.
Please take a look at my notebook: https://www.kaggle.com/rbhambri/bert-classifier-imdb-20newsgroup
Observation:
The kernel is not able to use the GPU when I specify
bert_model_name = 'bert_en_uncased'
and training time on the CPU is extremely long.
The tutorial works fine as written with the small BERT model.
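One quick way to confirm whether the kernel actually sees the GPU is TensorFlow's public `tf.config` API. A minimal sketch (the `gpu_report` helper is my own, and it is guarded so it also runs in environments where TensorFlow is not installed):

```python
def gpu_report() -> str:
    """Describe GPU visibility; degrades gracefully if TensorFlow is absent."""
    try:
        import tensorflow as tf
    except ImportError:
        return "tensorflow not installed"
    # list_physical_devices returns the devices TensorFlow can place ops on.
    gpus = tf.config.list_physical_devices("GPU")
    if gpus:
        return f"{len(gpus)} GPU(s) visible to TensorFlow"
    return "no GPU visible; ops will fall back to the CPU"

print(gpu_report())
```

If this prints "no GPU visible" on a GPU-enabled Kaggle kernel, the slowdown is consistent with training silently falling back to the CPU.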
Thanks!
Do the other models fail to run on the GPU too, or is bert_en_uncased the only one?
Update:
I had been using an older version of the tensorflow-text library. After upgrading to the matching release:
! pip install -U "tensorflow-text==2.6.*"
I could see the GPU in action for all models. You might want to check the latest version, 2.8.*.
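The upgrade above works because tensorflow-text is released in lockstep with TensorFlow, so the installed pair should share the same major.minor prefix. A small sketch of that check (`versions_compatible` is a hypothetical helper, not part of either library):

```python
def versions_compatible(tf_version: str, tf_text_version: str) -> bool:
    """True if the two version strings share the same major.minor prefix."""
    return tf_version.split(".")[:2] == tf_text_version.split(".")[:2]

print(versions_compatible("2.6.0", "2.6.2"))  # matching pair -> True
print(versions_compatible("2.8.0", "2.6.2"))  # mismatched pair -> False
```

In a live kernel you would pass in `tf.__version__` and `tensorflow_text.__version__` instead of literals.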
Otherwise, feel free to close this issue.
Thanks!