kagisearch/pyllms

Exception when calling Aleph Alpha model?

Terranic opened this issue · 1 comment

Hi all,

I'm getting the following exception when initializing the Aleph Alpha model:

Found luminous-extended in AlephAlphaProvider
127.0.0.1 - - [11/Sep/2023 21:20:51] "POST /init HTTP/1.1" 500 -
Traceback (most recent call last):
  File "/home/cw/LLM_BACKEND/LLMB_backend.py", line 164, in init
    pyllmsModel=llms.init(alephalpha_api_key=aleph_alpha_key, model=model)
  File "/home/cw/.local/lib/python3.10/site-packages/llms/__init__.py", line 8, in init
    return LLMS(*args, **kwargs)
  File "/home/cw/.local/lib/python3.10/site-packages/llms/llms.py", line 70, in __init__
    self._providers.append(provider.provider(api_key=provider.api_key, model=single_model))
  File "/home/cw/.local/lib/python3.10/site-packages/llms/providers/aleph.py", line 28, in __init__
    self.async_client = AsyncClient(api_key)
  File "/home/cw/.local/lib/python3.10/site-packages/aleph_alpha_client/aleph_alpha_client.py", line 618, in __init__
    self.session = RetryClient(
  File "/home/cw/.local/lib/python3.10/site-packages/aiohttp_retry/client.py", line 193, in __init__
    client = ClientSession(*args, **kwargs)
  File "/home/cw/.local/lib/python3.10/site-packages/aiohttp/client.py", line 228, in __init__
    loop = get_running_loop(loop)
  File "/home/cw/.local/lib/python3.10/site-packages/aiohttp/helpers.py", line 288, in get_running_loop
    loop = asyncio.get_event_loop()
  File "/usr/lib/python3.10/asyncio/events.py", line 656, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'Thread-20 (process_request_thread)'.

Is this issue known?
I don't face this problem with OpenAI models like gpt-3.5-turbo.
Thanks!
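
Edit: for anyone hitting the same error, the RuntimeError is raised because aiohttp's `ClientSession` looks up the current event loop, and the worker thread handling the request (`Thread-20 (process_request_thread)`) has none set. Below is a minimal, untested sketch of a possible workaround, assuming you control the code that calls `llms.init` inside that thread (the `init_aleph_model` wrapper and its arguments are just illustrative, taken from the call shown in the traceback):

```python
import asyncio

import llms


def init_aleph_model(aleph_alpha_key: str, model: str):
    # Worker threads have no event loop by default in Python 3.10,
    # so asyncio.get_event_loop() raises RuntimeError there.
    # Create and register a loop for this thread before initializing.
    try:
        asyncio.get_event_loop()
    except RuntimeError:
        asyncio.set_event_loop(asyncio.new_event_loop())
    return llms.init(alephalpha_api_key=aleph_alpha_key, model=model)
```

Alternatively, initializing the provider once in the main thread (where a loop can be created implicitly) and reusing that instance from the request handlers may avoid the problem entirely.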

We're open to PRs; we're currently pretty busy with other work.