redotvideo/mamba-chat

How could I run this on windows 10?

KevinRyu opened this issue · 5 comments

Hello,

When I tried to install packages with requirements.txt, I got the following error.

Error
ERROR: Could not find a version that satisfies the requirement triton (from mamba-ssm) (from versions: none)
ERROR: No matching distribution found for triton

As far as I know, the triton package only supports Linux.
What should I do?

Hey, it probably doesn't make sense to run this on Windows, as you'll need a GPU (which I assume you don't have locally). It's probably best to use a cloud GPU service or just run it on Google Colab. You can find a Mamba-Chat demo on Google Colab here

Does Mamba Chat only run on a GPU?

@SzaremehrjardiMT Currently yes. There's an open issue in llama.cpp to support the mamba architecture, though, which would make it possible to run without a GPU: ggerganov/llama.cpp#4353

For security and other reasons, we are not allowed to use any operating system except Windows 10 at our company, and we are not allowed to use WSL2 either. I'm sorry to ask for Windows support. I have a company desktop with a GPU, but I keep struggling with OS-related troubles like these. I was able to run the Colab notebook successfully outside the company. Thanks.

@KevinRyu It won't be optimized, but you can try mamba-minimal
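For context on why a CPU fallback like mamba-minimal is possible at all: the GPU requirement comes from the fused Triton/CUDA kernels for the selective scan, but the underlying state-space recurrence itself is just a sequential loop that any hardware can run. Below is a minimal, illustrative sketch of that recurrence in plain Python (this is not mamba-minimal's actual API; names and shapes are simplified assumptions):

```python
def ssm_scan(A, B, C, x):
    """Toy sequential state-space recurrence:
        h_t = A * h_{t-1} + B * x_t   (elementwise decay + input write)
        y_t = <C, h_t>                (scalar readout per step)
    A, B, C are length-n lists; x is the input sequence.
    Repos like mamba-minimal implement this loop in plain PyTorch,
    which is why it runs on CPU but is much slower than fused kernels.
    """
    n = len(A)
    h = [0.0] * n          # hidden state starts at zero
    ys = []
    for x_t in x:
        h = [A[i] * h[i] + B[i] * x_t for i in range(n)]
        ys.append(sum(C[i] * h[i] for i in range(n)))
    return ys

# Toy run: 4-dim state with decay 0.5, constant input
A = [0.5] * 4
B = [1.0] * 4
C = [1.0] * 4
y = ssm_scan(A, B, C, [1.0, 1.0])
```

The strictly sequential dependency of `h` on the previous step is what the Triton kernels parallelize; without them the loop is correct but slow, which matches the "won't be optimized" caveat above.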