SuwaidAslam/AI_Generated_Text_Checker_App

Not start

Closed this issue · 1 comment

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Traceback (most recent call last):
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/index.py", line 16, in
AppCallback(app)
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/AppCallback.py", line 51, in init
tokenizer = AutoTokenizer.from_pretrained(full_path, local_files_only=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/env/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 659, in from_pretrained
return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/env/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1801, in from_pretrained
return cls._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/env/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1831, in _from_pretrained
slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/env/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1956, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kali/Desktop/AI_Generated_Text_Checker_App/env/lib/python3.11/site-packages/transformers/models/roberta/tokenization_roberta.py", line 226, in init
with open(vocab_file, encoding="utf-8") as vocab_handle:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType

This was caused by the model files not being present in the roberta-base-openai-detector directory. Download the archive from the link below, unzip it, and replace the existing roberta-base-openai-detector folder with the extracted one.
https://drive.google.com/file/d/16Be7V62ew3PVf5pbxNOr_uLB1CTGpYg-/view?usp=sharing
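For anyone hitting the same TypeError: it means the tokenizer files (vocab.json in particular) were not found in the local folder, so the slow RoBERTa tokenizer was constructed with vocab_file=None and open(None, ...) failed. A minimal sketch to verify the folder before loading; the folder name comes from the reply above and the relative path is an assumption, so adjust it to your checkout:

```python
from pathlib import Path
from transformers import AutoTokenizer

# Folder name taken from the reply above; adjust the path to where the
# repository is checked out on your machine (assumption, not the app's code).
model_dir = Path("roberta-base-openai-detector")

# The slow RoBERTa tokenizer reads vocab.json and merges.txt. If vocab.json
# is missing, the tokenizer is built with vocab_file=None and
# open(None, encoding="utf-8") raises the TypeError shown in the traceback.
expected = ["vocab.json", "merges.txt"]
missing = [name for name in expected if not (model_dir / name).is_file()]

if missing:
    print(f"Missing tokenizer files in {model_dir}: {missing}")
    print("Download the archive linked above and unzip it over this folder.")
else:
    tokenizer = AutoTokenizer.from_pretrained(str(model_dir), local_files_only=True)
    print("Tokenizer loaded:", type(tokenizer).__name__)
```

The "None of PyTorch, TensorFlow >= 2.0, or Flax have been found" warning at the top is separate: it only means no deep-learning backend is installed in the virtual environment, so the model itself (as opposed to the tokenizer) will not load until one of them is installed.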