patil-suraj/question_generation

How would you fix these issues to get the project running?

eQX8HNMTU2 opened this issue · 1 comment

I tried installing Rust (the conda version) and the Microsoft Visual C++ Build Tools, and it's still not working. This is the error I'm getting. Any help is appreciated. Do I need to lower my Python version or something?

Collecting transformers==3.0.0
  Using cached transformers-3.0.0-py3-none-any.whl (754 kB)
Collecting numpy
  Using cached numpy-1.24.3-cp311-cp311-win_amd64.whl (14.8 MB)
Collecting tokenizers==0.8.0-rc4
  Using cached tokenizers-0.8.0rc4.tar.gz (96 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: packaging in c:\users\andih\.conda\envs\patil-qg\lib\site-packages (from transformers==3.0.0) (23.1)
Collecting filelock
  Using cached filelock-3.12.0-py3-none-any.whl (10 kB)
Collecting requests
  Using cached requests-2.28.2-py3-none-any.whl (62 kB)
Collecting tqdm>=4.27
  Using cached tqdm-4.65.0-py3-none-any.whl (77 kB)
Collecting regex!=2019.12.17
  Using cached regex-2023.3.23-cp311-cp311-win_amd64.whl (267 kB)
Collecting sentencepiece
  Using cached sentencepiece-0.1.98-cp311-cp311-win_amd64.whl (977 kB)
Collecting sacremoses
  Using cached sacremoses-0.0.53-py3-none-any.whl
...
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml): started
  Building wheel for tokenizers (pyproject.toml): finished with status 'error'
Failed to build tokenizers
  error: subprocess-exited-with-error
  
  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [48 lines of output]
      C:\Users\andih\AppData\Local\Temp\pip-build-env-i3myygk_\overlay\Lib\site-packages\setuptools\dist.py:519: InformationOnly: Normalizing '0.8.0.rc4' to '0.8.0rc4'
        self.metadata.version = self._normalize_version(
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-311
      creating build\lib.win-amd64-cpython-311\tokenizers
      copying tokenizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers
      creating build\lib.win-amd64-cpython-311\tokenizers\models
      copying tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\models
      creating build\lib.win-amd64-cpython-311\tokenizers\decoders
      copying tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\decoders
      creating build\lib.win-amd64-cpython-311\tokenizers\normalizers
      copying tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-311\tokenizers\pre_tokenizers
      copying tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-311\tokenizers\processors
      copying tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\processors
      creating build\lib.win-amd64-cpython-311\tokenizers\trainers
...
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
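
The question about the Python version is probably the right track: tokenizers 0.8.0rc4 is a mid-2020 release and only shipped prebuilt wheels for the Python versions of that era, so on Python 3.11 pip falls back to compiling it from source with Rust, and that old release does not build cleanly against current toolchains. A minimal sketch of that workaround, assuming conda is used and an older interpreter (e.g. Python 3.7) is acceptable for this project; the environment name below is made up:

# create a fresh environment on an older Python, for which prebuilt
# tokenizers 0.8.0rc4 wheels should exist, so no Rust build is needed
conda create -n qg-env python=3.7
conda activate qg-env
pip install transformers==3.0.0

If pip then downloads a .whl for tokenizers instead of the .tar.gz source archive, it picked up a prebuilt wheel and the Rust toolchain is no longer involved.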

I even tried installing Rust in Google Colab, and it's still not working:

Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting transformers==3.0.0
  Using cached transformers-3.0.0-py3-none-any.whl (754 kB)
Requirement already satisfied: requests in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (2.27.1)
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (2022.10.31)
Collecting tokenizers==0.8.0-rc4
  Using cached tokenizers-0.8.0rc4.tar.gz (96 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: filelock in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (3.11.0)
Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (4.65.0)
Requirement already satisfied: numpy in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (1.22.4)
Collecting sacremoses
  Using cached sacremoses-0.0.53-py3-none-any.whl
Collecting sentencepiece
  Using cached sentencepiece-0.1.98-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
Requirement already satisfied: packaging in /usr/local/lib/python3.9/dist-packages (from transformers==3.0.0) (23.1)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.9/dist-packages (from requests->transformers==3.0.0) (3.4)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.9/dist-packages (from requests->transformers==3.0.0) (2.0.12)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.9/dist-packages (from requests->transformers==3.0.0) (1.26.15)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.9/dist-packages (from requests->transformers==3.0.0) (2022.12.7)
Requirement already satisfied: click in /usr/local/lib/python3.9/dist-packages (from sacremoses->transformers==3.0.0) (8.1.3)
Requirement already satisfied: six in /usr/local/lib/python3.9/dist-packages (from sacremoses->transformers==3.0.0) (1.16.0)
Requirement already satisfied: joblib in /usr/local/lib/python3.9/dist-packages (from sacremoses->transformers==3.0.0) (1.2.0)
Building wheels for collected packages: tokenizers
  error: subprocess-exited-with-error
  
  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  Building wheel for tokenizers (pyproject.toml) ... error
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
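
For the Colab attempt: installing Rust by itself is often not enough, because cargo also has to be on PATH for the build subprocess that pip spawns. A rough sketch of how that could look in a notebook cell, assuming the standard rustup installer; note that even with Rust available, this pinned tokenizers release may still fail to compile against a current Rust and Python 3.9 combination, so the older-Python route sketched above is likely the more reliable fix:

# non-interactive rustup install (sketch only, not verified in this thread)
!curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
import os
# make cargo visible to the shell subprocess that pip spawns for the build
os.environ["PATH"] += os.pathsep + os.path.expanduser("~/.cargo/bin")
!pip install tokenizers==0.8.0rc4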

Never mind, got it fixed.