Stonelinks/llama-cpp-python

module API server should be importable

Stonelinks opened this issue

The server located at `__main__.py` can't be imported (I think because it's defined in a `__main__.py`).

```
>>> from llama_cpp.server import app
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'app' from 'llama_cpp.server' (unknown location)
```

It would be nice for `app` to be importable so users could mount it behind a blueprint or some other WSGI application framework without needing a proxy server.
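One common way to support this is an app factory: move the app construction out of `__main__.py` into an importable module, and keep `__main__.py` as a thin launcher. The sketch below illustrates the pattern with a plain stdlib WSGI callable so it is self-contained; the module path `llama_cpp/server/app.py` and the `create_app` name are hypothetical, and the real server's framework may differ.

```python
# Hypothetical refactor sketch, not the project's actual layout.

# --- llama_cpp/server/app.py (hypothetical importable module) ---
def create_app(config=None):
    """App factory: returns a WSGI callable that callers can mount
    behind a blueprint, dispatcher, or any WSGI server."""
    config = config or {}

    def app(environ, start_response):
        # Placeholder handler standing in for the real server logic.
        body = b"llama.cpp server placeholder"
        start_response(
            "200 OK",
            [("Content-Type", "text/plain"),
             ("Content-Length", str(len(body)))],
        )
        return [body]

    return app


# --- llama_cpp/server/__main__.py (hypothetical thin launcher) ---
def main():
    from wsgiref.simple_server import make_server
    with make_server("127.0.0.1", 8000, create_app()) as httpd:
        httpd.serve_forever()


if __name__ == "__main__":
    main()
```

With this split, `python -m llama_cpp.server` still starts a standalone server, while `from llama_cpp.server.app import create_app` lets users embed the same app in a larger application.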