mudler/LocalAI

windows compatibility?

anm5704 opened this issue · 6 comments

I'm a beginner. Is this program compatible with Windows, and what are the necessary steps? I already have alpaca.cpp installed on my laptop.

mudler commented

Hi!

I don't have a Windows machine to double-check on, but I don't think there is anything specific preventing it from working there.

Just install Docker and start in API mode (https://github.com/go-skynet/llama-cli#advanced-usage), or I guess WSL should work too.

Thank you, I ran it after installing Docker:
docker run -ti --rm quay.io/go-skynet/llama-cli:v0.1 --instruction "What's an alpaca?" --topk 10000
The process seems to start for 5-6 seconds and then end without displaying any output. The same thing happens in API mode: the localhost site doesn't even load, and the process stops after 5-6 seconds without output. Any idea what might be causing this?


I was able to fix this by allocating more memory in Docker Desktop settings:

[screenshot of the Docker Desktop memory settings]

Hello! Thanks, it seems on Windows you have to create a .wslconfig file in order to increase the limits, but it's working now.
I'm confused though: the API is for LLaMA, right, and not the fine-tuned Alpaca models (trained on GPT-3 data)?
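
For anyone hitting the same out-of-memory behavior, a minimal `.wslconfig` raising the WSL 2 limits (which also cap Docker Desktop's WSL 2 backend) could look like this; the file goes in your Windows user profile folder, and the exact values here are illustrative, not recommendations:

```ini
; %UserProfile%\.wslconfig -- applies to the WSL 2 VM as a whole
[wsl2]
; Raise the memory cap (by default WSL 2 gets roughly half of host RAM)
memory=8GB
; Optional: number of virtual processors assigned to the VM
processors=4
```

After saving, run `wsl --shutdown` so the new limits take effect the next time WSL (and Docker Desktop) starts.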

mudler commented

The API should work for both, although I've tested only Alpaca models. The new version 0.3 has an --alpaca boolean flag too.

mudler commented

I guess we can close this now? Thanks @stasadance for pointing us in the right direction!