ggozad/oterm

Application crashes if a model is no longer available in Ollama but still has an open session

Closed this issue · 4 comments

If I start oterm while it has open sessions for a Llama model that is no longer available, oterm crashes.

```
➜ ~ OLLAMA_URL=http://127.0.0.1:11434 OTERM_VERIFY_SSL=False oterm
Traceback (most recent call last):
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/oterm/app/oterm.py", line 146, in on_mount
    store = await Store.get_store()
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/oterm/store/store.py", line 47, in get_store
    await step(self.db_path)
    # locals: db_version = '0.2.10', current_version = '0.5.2',
    #         steps = [<function parameters at 0x10691a200>],
    #         data_path = PosixPath('/Users/neilschark/Library/Application Support/oterm')
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/oterm/store/upgrades/v0_3_0.py", line 24, in parameters
    info = OllamaLLM.show(model)
    # locals: chat_models = [(1, 'llama3:latest'), (6, 'llama3:latest')],
    #         model = 'llama3:latest'
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/oterm/ollamaclient.py", line 92, in show
    return client.show(model)
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/ollama/_client.py", line 471, in show
    return self._request('POST', '/api/show', json={'name': model}).json()
  File "/opt/homebrew/Cellar/oterm/0.5.2/libexec/lib/python3.12/site-packages/ollama/_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
    # locals: response = <Response [404 Not Found]>, url = '/api/show'
ResponseError: model 'llama3:latest' not found
```

Hey, thanks! I am currently on a short vacation; will fix asap.

No worries, enjoy your vacation, that's more important.

ggozad commented

Hey there! Sorry for the delay ;)
I looked briefly into this. oterm indeed fails when a model is not present, but does not crash. I will fix this eventually, but would like to land support for a number of other things first.
For the time being, as a workaround, the obvious thing is to download the model again or delete the chat.

Now, in your case, it seems the crash happens when oterm tries to upgrade the database from an older version: the migration cannot fetch the default parameters for a model that no longer exists.
Could you please verify this is the case?
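The migration step in the traceback (`parameters` in `v0_3_0.py`) calls `OllamaLLM.show(model)` for every stored chat and lets the `ResponseError` propagate. A minimal sketch of a more defensive version of that loop is below; `ResponseError` and `show` are stand-ins mimicking the ollama client API seen in the traceback, not the real implementation.

```python
class ResponseError(Exception):
    """Stand-in for ollama.ResponseError (raised on HTTP 404)."""


def show(model):
    # Stand-in for OllamaLLM.show(); pretend only llama2 is still installed.
    if model != "llama2:latest":
        raise ResponseError(f"model '{model}' not found")
    return {"parameters": "num_ctx 4096"}


def upgrade_parameters(chat_models):
    """Collect parameters for models that still exist, skipping
    (rather than crashing on) models Ollama no longer knows about."""
    updated = {}
    for chat_id, model in chat_models:
        try:
            info = show(model)
        except ResponseError:
            continue  # model gone from Ollama: keep defaults, don't crash
        updated[chat_id] = info["parameters"]
    return updated


# Chat 1 references a missing model and is skipped; chat 2 survives.
updated = upgrade_parameters([(1, "llama3:latest"), (2, "llama2:latest")])
# → {2: "num_ctx 4096"}
```

The point of the sketch is only the `try/except` around the `show()` call; the real fix would also decide what default parameters a skipped chat should keep.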
To fix your problem I would suggest deleting the database and starting from scratch. To do so, delete the SQLite db found at ~/Library/Application Support/oterm/store.db.
Alternatively, if you don't want to lose the other conversations you have stored, you can use something like "DB Browser for SQLite" to delete the erroneous chat manually.
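The same manual cleanup can be done with Python's stdlib `sqlite3`. This is a sketch against an in-memory copy; the `chat` table and its `id`/`model` columns are inferred from the traceback, and for the real store you would open `~/Library/Application Support/oterm/store.db` instead (back it up first).

```python
import sqlite3


def delete_chats_for_model(conn, model):
    """Delete all chats referencing the given model; return rows removed."""
    cur = conn.execute("DELETE FROM chat WHERE model = ?", (model,))
    conn.commit()
    return cur.rowcount


# In-memory stand-in for store.db, seeded with the chats from the traceback
# plus one for a model that still exists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chat (id INTEGER PRIMARY KEY, model TEXT)")
conn.executemany(
    "INSERT INTO chat (id, model) VALUES (?, ?)",
    [(1, "llama3:latest"), (6, "llama3:latest"), (7, "llama2:latest")],
)

removed = delete_chats_for_model(conn, "llama3:latest")
# → removed == 2; only the llama2 chat remains
```

Note this only covers the `chat` table visible in the traceback; if the real schema has related tables (e.g. stored messages), those rows would need cleaning up too.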
Please let me know if this helps.
I will keep this open and fix it in the near future.

Hello, I hope you had a nice vacation.

Thank you, cleaning up my history indeed works as a workaround!