octimot/StoryToolkitAI

CUDA-supported Torch is uninstalled due to typing_extensions==4.7

Closed this issue · 5 comments

eng3 commented

Describe the bug
Following the detailed installation instructions on Windows. When running the extra commands to uninstall torch and reinstall it with CUDA support, pip upgrades typing_extensions to the latest version. On startup, StoryToolkitAI automatically checks the requirements, detects the "wrong" typing_extensions version, and reverses the change: it uninstalls torch and typing_extensions, then installs typing_extensions 4.7 and the non-CUDA build of torch.
As a workaround, I tried changing the pin to typing_extensions>=4.7, and the tool seems to run, but I don't know the real consequences.

To Reproduce
Steps to reproduce the behavior:

  1. Follow the detailed installation instructions
  2. Run the extra steps to install the CUDA version of torch
  3. Start StoryToolkitAI and watch torch get reverted (a quick check is sketched below)
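
For reference, here's a quick way to confirm what happened from inside the tool's virtual environment (assuming torch is importable there):

```python
# Check which torch build is installed; run this inside the StoryToolkitAI
# virtual environment before and after starting the tool.
import torch

print(torch.__version__)          # a CUDA wheel reports something like "2.1.0+cu118", the reverted one "+cpu"
print(torch.cuda.is_available())  # True with the CUDA build, False after the revert
```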

Expected behavior
It should not uninstall the CUDA-enabled torch.

System (please complete the following information):

  • OS: Windows 11
  • Python versions installed on machine: 3.10
  • StoryToolkitAI version: latest Git pull

Thanks for reporting this! I'll check if anything breaks when using typing_extensions>=4.7 and update soon!

Cheers

Would you mind opening requirements.txt and changing typing_extensions==4.7 to typing_extensions>=4.12.2?

This should solve the issue and allow you to install torch with CUDA further down the line. Based on our tests over here, it doesn't conflict with other packages, so if this works on your end, I'll push the patch.
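
For anyone curious why the strict pin triggers the revert, here's a minimal sketch using the packaging library; the 4.12.2 value is just an assumed example of what pip resolves alongside the CUDA wheel:

```python
# Minimal sketch of the kind of version check that causes the revert: the
# version installed alongside the CUDA torch wheel fails the strict "==4.7"
# pin but passes the relaxed ">=4.12.2" specifier.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

installed = Version("4.12.2")  # assumption: whatever pip resolved during the CUDA install

print(installed in SpecifierSet("==4.7"))     # False -> the startup check downgrades it
print(installed in SpecifierSet(">=4.12.2"))  # True  -> torch with CUDA is left in place
```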

Cheers!

eng3 commented

I think I mentioned in my original message that this is how I got it to work. I just wasn't sure of the consequences of allowing a higher version, since the version number was explicitly pinned in requirements.txt.

Right, I see!

It seems the strict pin was added when we switched to a newer version of the openai package, but that package appears to work fine with newer versions of typing_extensions. Beyond that, it doesn't seem to affect the tool in any way, at least not on our side...
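
If anyone wants to double-check on their own setup, a simple smoke test (just a sanity pass, not the tool's own requirements check) would be:

```python
# Sanity check after loosening the pin: print the resolved versions and
# confirm the CUDA build of torch survived a StoryToolkitAI startup.
from importlib.metadata import version

import torch

for dist in ("typing_extensions", "openai", "torch"):
    print(dist, version(dist))
print("CUDA available:", torch.cuda.is_available())
```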

Thanks!

I just pushed d419309, which should patch this issue.

Thanks for the feedback!