API errors with "Could not find the character 'Alpaca'" error
Opened this issue · 14 comments
$ nix run .#textgen-nvidia -- --api --auto-launch
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
2023-12-12 23:44:37 ERROR:Could not find the character "Alpaca" inside instruction-templates/. No character has been loaded.
Traceback (most recent call last):
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/gradio/routes.py", line 414, in run_predict
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/gradio/blocks.py", line 1323, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/gradio/blocks.py", line 1051, in call_function
prediction = await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2106, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 833, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/chat.py", line 561, in load_character
raise ValueError
ValueError
I'm using 63339e4.
Possibly related:
I think maybe the API is just not working at all; nothing is listening on port 5000.
Can anyone else confirm otherwise?
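To check whether anything is actually listening on port 5000, a quick TCP probe works without any external tools. This is my own helper (not part of the repo); the port number comes from the comment above.

```python
# Probe a TCP port: connect succeeds only if something is listening.
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If the api extension started, this should print True.
print(port_open("127.0.0.1", 5000))
```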
@ParetoOptimalDev can you try out #71 and check that it works and doesn't break anything else? If it does, I'll go ahead and merge. Make sure to rm -rf ~/.textgen before performing that test.
I've been debating removing text-generation-webui in favor of https://github.com/imartinez/privateGPT, as it will be easier to maintain and more straightforward to use for basic tasks. Text-generation-webui has a lot of undeclared dependencies and magical runtime behavior that will never be fully encapsulated by our effort here.
Also, as mentioned in other PRs, we need VM tests, as they will prevent these kinds of issues from happening in the future.
I tried #71 last night, and when textgen loaded it was stuck on a prompt of some sort. I'm pretty sure I deleted ~/.textgen, but I'm not positive. I'll test that again.
First, though, since you mentioned moving to privateGPT, I'll test that. It should just be this, right?
nix run github:MatthewCroughan/privateGPT
We'll see I guess, it's currently building.
Also, do you know if privateGPT supports exllamav2? I'm basically just trying to test the perf improvements of exllamav2 over an OpenAI-compatible API; I don't really care what with.
@ParetoOptimalDev yeah, I have a flake at that URL, which you found, and it should work; let me know if it doesn't.
I'll have to clear some space after all of these copies of cuda ;)
Text-generation-webui has a lot of undeclared dependencies and magical behavior at runtime that will never be fully encapsulated by our effort here
Can you give me some examples of these? I'm trying to get better at packaging python applications in Nix and this would be really valuable for me to understand.
Well, a good example is that when it launches, the webui offers you the ability to choose between different models with different quantization methods: gguf, gptq, bitsandbytes. It doesn't ask for those at "install time", because it doesn't formally have a pyproject.toml or any other machine-readable, parseable format that would allow us to detect what it wants. Instead, it just lets the user choose between these methods at runtime, and each choice obviously pulls in a dependency on llama-cpp, autogptq, or bitsandbytes, which are complex programs to package in their own right. So it will blow up at runtime, since we didn't know, and couldn't have known, that it wanted these things in order to run.
It should be added that quantization methods change all the time, as do the runtime dependencies, and it's very hard to keep track of the missing, undeclared information as the upstream project develops. privateGPT, by contrast, actually has a pyproject.toml and poetry.lock that can be read and automatically parsed by Nix, or any other package manager for that matter.
Thanks!
Also, your privateGPT flake isn't working for me. It seems not to have any binaries, and the site-packages in the nix shell only contains privateGPT:
$ nix run --refresh github:MatthewCroughan/privateGPT
error: unable to execute '/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/bin/private-gpt': No such file or directory
@ParetoOptimalDev it's a poetry2nix project, it's nix develop
and then you can python -m private_gpt
Hm... okay... trying that gets me:
Me wasting time not understanding you are meant to clone the repo
/tmp $ nix develop github:MatthewCroughan/privateGPT -c python -m private_gpt
14:48:58.761 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/main.py", line 5, in <module>
from private_gpt.di import global_injector
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/di.py", line 3, in <module>
from private_gpt.settings.settings import Settings, unsafe_typed_settings
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings.py", line 238, in <module>
unsafe_settings = load_active_settings()
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 53, in load_active_settings
loaded_profiles = [
^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 54, in <listcomp>
load_settings_from_profile(profile) for profile in active_profiles
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 43, in load_settings_from_profile
with Path(path).open("r") as f:
^^^^^^^^^^^^^^^^^^^^
File "/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11/pathlib.py", line 1044, in open
return io.open(self, mode, buffering, encoding, errors, newline)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/settings.yaml'
/tmp [1] $ touch settings.yaml
/tmp $ nix develop github:MatthewCroughan/privateGPT -c python -m private_gpt
14:49:14.279 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/main.py", line 5, in <module>
from private_gpt.di import global_injector
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/di.py", line 3, in <module>
from private_gpt.settings.settings import Settings, unsafe_typed_settings
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings.py", line 238, in <module>
unsafe_settings = load_active_settings()
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 53, in load_active_settings
loaded_profiles = [
^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 54, in <listcomp>
load_settings_from_profile(profile) for profile in active_profiles
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/settings/settings_loader.py", line 46, in load_settings_from_profile
raise TypeError(f"Config file has no top-level mapping: {path}")
TypeError: Config file has no top-level mapping: /tmp/settings.yaml
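That TypeError makes sense in hindsight: an empty settings.yaml parses to None rather than a dict, so the loader's mapping check fails. Here is a hypothetical re-creation of that check (the function name is mine, not privateGPT's):

```python
# Mimics the guard in settings_loader.py that produced the error above:
# the parsed YAML document must be a dict (a top-level mapping).
def check_top_level_mapping(doc, path="settings.yaml"):
    if not isinstance(doc, dict):
        raise TypeError(f"Config file has no top-level mapping: {path}")
    return doc

check_top_level_mapping({"server": {"port": 8001}})  # a real mapping passes
try:
    # None is what a YAML parser returns for an empty file (e.g. after `touch`)
    check_top_level_mapping(None)
except TypeError as e:
    print(e)
```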
/tmp [1] $ curl -LO https://raw.githubusercontent.com/MatthewCroughan/privateGPT/main/settings.yaml
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1920 100 1920 0 0 20200 0 --:--:-- --:--:-- --:--:-- 20210
/tmp $ nix develop github:MatthewCroughan/privateGPT -c python -m private_gpt
14:50:22.396 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/main.py", line 11, in <module>
app = create_app(global_injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/launcher.py", line 48, in create_app
from private_gpt.ui.ui import PrivateGptUi
File "/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/ui/ui.py", line 25, in <module>
THIS_DIRECTORY_RELATIVE = Path(__file__).parent.relative_to(PROJECT_ROOT_PATH)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11/pathlib.py", line 730, in relative_to
raise ValueError("{!r} is not in the subpath of {!r}"
ValueError: '/nix/store/a19cq9qjqwy0x433qvvd4si487cpikv5-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages/private_gpt/ui' is not in the subpath of '/tmp' OR one path is relative and the other is absolute.
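That ValueError comes straight from pathlib: Path.relative_to() only works when one path is inside the other. privateGPT computes its UI paths relative to the project root (the working directory), so running it from /tmp against code installed in /nix/store can't work. A minimal demonstration:

```python
from pathlib import Path

# Works: /repo/private_gpt/ui is inside /repo.
inside = Path("/repo/private_gpt/ui").relative_to("/repo")
print(inside)  # private_gpt/ui

# Fails the same way as above: the installed package is not under /tmp.
try:
    Path("/nix/store/pkg/ui").relative_to("/tmp")
except ValueError:
    print("not a subpath")
```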
Okay, so now I know the repo should be downloaded:
Getting an error about the models path not existing
/tmp $ cd privateGPT/
/tmp/privateGPT $ nix develop -c python -m private_gpt
14:52:35.917 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
14:52:37.606 [INFO ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=local
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.ui.ui.PrivateGptUi'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.server.ingest.ingest_service.IngestService'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.components.llm.llm_component.LLMComponent'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/tmp/privateGPT/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "/tmp/privateGPT/private_gpt/main.py", line 11, in <module>
app = create_app(global_injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/privateGPT/private_gpt/launcher.py", line 50, in create_app
ui = root_injector.get(PrivateGptUi)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1040, in call_with_injection
return callable(*full_args, **dependencies)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/privateGPT/private_gpt/components/llm/llm_component.py", line 28, in __init__
self.llm = LlamaCPP(
^^^^^^^^^
File "/nix/store/qyn2zjd6q3231lca64z1dxv04wa6gi1l-python3.11-llama-index-0.9.3/lib/python3.11/site-packages/llama_index/llms/llama_cpp.py", line 119, in __init__
raise ValueError(
ValueError: Provided model path does not exist. Please check the path or provide a model_url to download.
So then I looked at the settings.yaml for about 10 minutes before figuring out the path it probably wants.
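A sanity check I could have run first: verify that the model file settings.yaml points at actually exists before launching. The helper below is my own; the path mirrors the one llama_index later complained about.

```python
from pathlib import Path

def model_available(p: str) -> bool:
    # True only if the path resolves to an existing regular file
    return Path(p).expanduser().is_file()

# In my case the gguf file had to live under ./models in the cloned repo.
print(model_available("models/openhermes-2-mistral-7b.Q4_K_M.gguf"))
```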
Finally get it to load a model... but then.. "OSError: libcufft.so.11: cannot open shared object file: No such file or directory"... and I'm done with this for now :)
/tmp/privateGPT [1] $ nix develop -c python -m private_gpt
14:58:25.131 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
14:58:26.768 [INFO ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=local
ggml_init_cublas: GGML_CUDA_FORCE_MMQ: no
ggml_init_cublas: CUDA_USE_TENSOR_CORES: yes
ggml_init_cublas: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3060 Ti, compute capability 8.6
llama_model_loader: loaded meta data with 19 key-value pairs and 291 tensors from /tmp/privateGPT/models/openhermes-2-mistral-7b.Q4_K_M.gguf (version GGUF V2)
llama_model_loader: - tensor 0: token_embd.weight q4_K [ 4096, 32002, 1, 1 ]
llama_model_loader: - tensor 1: blk.0.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 2: blk.0.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 3: blk.0.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 4: blk.0.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 5: blk.0.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 6: blk.0.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 7: blk.0.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 8: blk.0.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 9: blk.0.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: [... remaining tensor listing omitted; the same per-block q4_K/q6_K/f32 pattern repeats for the rest of the 291 tensors ...]
llama_model_loader: - tensor 162: blk.17.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 163: blk.18.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 164: blk.18.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 165: blk.18.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 166: blk.18.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 167: blk.18.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 168: blk.18.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 169: blk.18.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 170: blk.18.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 171: blk.18.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 172: blk.19.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 173: blk.19.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 174: blk.19.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 175: blk.19.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 176: blk.19.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 177: blk.19.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 178: blk.19.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 179: blk.19.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 180: blk.19.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 181: blk.20.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 182: blk.20.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 183: blk.20.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 184: blk.20.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 185: blk.20.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 186: blk.20.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 187: blk.20.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 188: blk.20.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 189: blk.20.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 190: blk.21.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 191: blk.21.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 192: blk.21.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 193: blk.21.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 194: blk.21.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 195: blk.21.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 196: blk.21.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 197: blk.21.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 198: blk.21.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 199: blk.22.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 200: blk.22.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 201: blk.22.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 202: blk.22.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 203: blk.22.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 204: blk.22.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 205: blk.22.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 206: blk.22.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 207: blk.22.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 208: blk.23.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 209: blk.23.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 210: blk.23.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 211: blk.23.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 212: blk.23.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 213: blk.23.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 214: blk.23.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 215: blk.23.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 216: blk.23.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 217: blk.24.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 218: blk.24.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 219: blk.24.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 220: blk.24.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 221: blk.24.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 222: blk.24.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 223: blk.24.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 224: blk.24.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 225: blk.24.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 226: blk.25.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 227: blk.25.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 228: blk.25.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 229: blk.25.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 230: blk.25.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 231: blk.25.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 232: blk.25.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 233: blk.25.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 234: blk.25.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 235: blk.26.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 236: blk.26.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 237: blk.26.attn_v.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 238: blk.26.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 239: blk.26.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 240: blk.26.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 241: blk.26.ffn_down.weight q4_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 242: blk.26.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 243: blk.26.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 244: blk.27.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 245: blk.27.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 246: blk.27.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 247: blk.27.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 248: blk.27.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 249: blk.27.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 250: blk.27.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 251: blk.27.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 252: blk.27.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 253: blk.28.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 254: blk.28.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 255: blk.28.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 256: blk.28.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 257: blk.28.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 258: blk.28.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 259: blk.28.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 260: blk.28.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 261: blk.28.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 262: blk.29.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 263: blk.29.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 264: blk.29.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 265: blk.29.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 266: blk.29.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 267: blk.29.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 268: blk.29.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 269: blk.29.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 270: blk.29.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 271: blk.30.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 272: blk.30.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 273: blk.30.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 274: blk.30.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 275: blk.30.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 276: blk.30.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 277: blk.30.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 278: blk.30.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 279: blk.30.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 280: blk.31.attn_q.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 281: blk.31.attn_k.weight q4_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 282: blk.31.attn_v.weight q6_K [ 4096, 1024, 1, 1 ]
llama_model_loader: - tensor 283: blk.31.attn_output.weight q4_K [ 4096, 4096, 1, 1 ]
llama_model_loader: - tensor 284: blk.31.ffn_gate.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 285: blk.31.ffn_up.weight q4_K [ 4096, 14336, 1, 1 ]
llama_model_loader: - tensor 286: blk.31.ffn_down.weight q6_K [ 14336, 4096, 1, 1 ]
llama_model_loader: - tensor 287: blk.31.attn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 288: blk.31.ffn_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 289: output_norm.weight f32 [ 4096, 1, 1, 1 ]
llama_model_loader: - tensor 290: output.weight q6_K [ 4096, 32002, 1, 1 ]
llama_model_loader: - kv 0: general.architecture str
llama_model_loader: - kv 1: general.name str
llama_model_loader: - kv 2: llama.context_length u32
llama_model_loader: - kv 3: llama.embedding_length u32
llama_model_loader: - kv 4: llama.block_count u32
llama_model_loader: - kv 5: llama.feed_forward_length u32
llama_model_loader: - kv 6: llama.rope.dimension_count u32
llama_model_loader: - kv 7: llama.attention.head_count u32
llama_model_loader: - kv 8: llama.attention.head_count_kv u32
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32
llama_model_loader: - kv 10: llama.rope.freq_base f32
llama_model_loader: - kv 11: general.file_type u32
llama_model_loader: - kv 12: tokenizer.ggml.model str
llama_model_loader: - kv 13: tokenizer.ggml.tokens arr
llama_model_loader: - kv 14: tokenizer.ggml.scores arr
llama_model_loader: - kv 15: tokenizer.ggml.token_type arr
llama_model_loader: - kv 16: tokenizer.ggml.bos_token_id u32
llama_model_loader: - kv 17: tokenizer.ggml.eos_token_id u32
llama_model_loader: - kv 18: general.quantization_version u32
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_K: 193 tensors
llama_model_loader: - type q6_K: 33 tensors
llm_load_vocab: special tokens definition check successful ( 261/32002 ).
llm_load_print_meta: format = GGUF V2
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32002
llm_load_print_meta: n_merges = 0
llm_load_print_meta: n_ctx_train = 32768
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = mostly Q4_K - Medium
llm_load_print_meta: model params = 7.24 B
llm_load_print_meta: model size = 4.07 GiB (4.83 BPW)
llm_load_print_meta: general.name = teknium_openhermes-2-mistral-7b
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 32000 '<|im_end|>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_tensors: ggml ctx size = 0.11 MB
llm_load_tensors: using CUDA for GPU acceleration
llm_load_tensors: mem required = 70.42 MB
llm_load_tensors: offloading 32 repeating layers to GPU
llm_load_tensors: offloading non-repeating layers to GPU
llm_load_tensors: offloaded 35/35 layers to GPU
llm_load_tensors: VRAM used: 4095.06 MB
...............................................................................................
llama_new_context_with_model: n_ctx = 3900
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: offloading v cache to GPU
llama_kv_cache_init: offloading k cache to GPU
llama_kv_cache_init: VRAM kv self = 487.50 MB
llama_new_context_with_model: kv self size = 487.50 MB
llama_build_graph: non-view tensors processed: 740/740
llama_new_context_with_model: compute buffer total size = 276.93 MB
llama_new_context_with_model: VRAM scratch buffer: 275.37 MB
llama_new_context_with_model: total VRAM used: 4857.93 MB (model: 4095.06 MB, context: 762.87 MB)
AVX = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 |
14:58:49.269 [INFO ] private_gpt.components.embedding.embedding_component - Initializing the embedding model in mode=local
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.ui.ui.PrivateGptUi'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.server.ingest.ingest_service.IngestService'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 798, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.components.embedding.embedding_component.EmbeddingComponent'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/nix/store/cq51xmnc4hrki59nqxrx8y2xdr2q18fl-python3.11-torch-2.1.1/lib/python3.11/site-packages/torch/__init__.py", line 174, in _load_global_deps
ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
File "/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11/ctypes/__init__.py", line 376, in __init__
self._handle = _dlopen(self._name, mode)
^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: libcufft.so.11: cannot open shared object file: No such file or directory
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/tmp/privateGPT/private_gpt/__main__.py", line 5, in <module>
from private_gpt.main import app
File "/tmp/privateGPT/private_gpt/main.py", line 11, in <module>
app = create_app(global_injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/privateGPT/private_gpt/launcher.py", line 50, in create_app
ui = root_injector.get(PrivateGptUi)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1031, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1079, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 974, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 800, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 811, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 998, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages/injector/__init__.py", line 1040, in call_with_injection
return callable(*full_args, **dependencies)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/privateGPT/private_gpt/components/embedding/embedding_component.py", line 25, in __init__
self.embedding_model = HuggingFaceEmbedding(
^^^^^^^^^^^^^^^^^^^^^
File "/nix/store/qyn2zjd6q3231lca64z1dxv04wa6gi1l-python3.11-llama-index-0.9.3/lib/python3.11/site-packages/llama_index/embeddings/huggingface.py", line 65, in __init__
from transformers import AutoModel, AutoTokenizer
File "/nix/store/86cb9yq9bjazz03zx07b627ii0d1by3p-python3.11-transformers-4.35.2/lib/python3.11/site-packages/transformers/__init__.py", line 26, in <module>
from . import dependency_versions_check
File "/nix/store/86cb9yq9bjazz03zx07b627ii0d1by3p-python3.11-transformers-4.35.2/lib/python3.11/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
from .utils.versions import require_version, require_version_core
File "/nix/store/86cb9yq9bjazz03zx07b627ii0d1by3p-python3.11-transformers-4.35.2/lib/python3.11/site-packages/transformers/utils/__init__.py", line 31, in <module>
from .generic import (
File "/nix/store/86cb9yq9bjazz03zx07b627ii0d1by3p-python3.11-transformers-4.35.2/lib/python3.11/site-packages/transformers/utils/generic.py", line 432, in <module>
import torch.utils._pytree as _torch_pytree
File "/nix/store/cq51xmnc4hrki59nqxrx8y2xdr2q18fl-python3.11-torch-2.1.1/lib/python3.11/site-packages/torch/__init__.py", line 234, in <module>
_load_global_deps()
File "/nix/store/cq51xmnc4hrki59nqxrx8y2xdr2q18fl-python3.11-torch-2.1.1/lib/python3.11/site-packages/torch/__init__.py", line 195, in _load_global_deps
_preload_cuda_deps(lib_folder, lib_name)
File "/nix/store/cq51xmnc4hrki59nqxrx8y2xdr2q18fl-python3.11-torch-2.1.1/lib/python3.11/site-packages/torch/__init__.py", line 160, in _preload_cuda_deps
raise ValueError(f"{lib_name} not found in the system path {sys.path}")
ValueError: libcusolver.so.*[0-9] not found in the system path ['/tmp/privateGPT', '/nix/store/dhhllfdyl4g8f7apd694dw8q6nb83gwa-python3.11-private-gpt-0.1.0/lib/python3.11/site-packages', '/nix/store/6kk8k5apicrn0bplhf9zy8mf3wya3m1x-python3.11-boto3-1.29.3/lib/python3.11/site-packages', '/nix/store/a2lmyhhg2777k28i7khwnm0d35jjpdhd-python3.11-botocore-1.32.3/lib/python3.11/site-packages', '/nix/store/7m70dihl9frihpmq7lrw43aaxqx85jvw-python3.11-jmespath-1.0.1/lib/python3.11/site-packages', '/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11/site-packages', '/nix/store/mmgf7kdr4ipjh22nb2z3p7smhwfmxcmc-python3.11-python-dateutil-2.8.2/lib/python3.11/site-packages', '/nix/store/9bwi6yzdvk2pya9pll1fx0zzpp56i2ly-python3.11-six-1.16.0/lib/python3.11/site-packages', '/nix/store/m9f6rakhaimm7s2cdynng8z8znqpnwzj-python3.11-urllib3-1.26.18/lib/python3.11/site-packages', '/nix/store/8lkfwczlll00mb1xfrdc5mgijcm0sbrw-python3.11-s3transfer-0.7.0/lib/python3.11/site-packages', '/nix/store/ggmm89iksripmqcvnh4b617swcb5l3ws-python3.11-chromadb-0.4.17/lib/python3.11/site-packages', '/nix/store/49wjc9l5aqss8gy0fz69xj73513b5xj1-python3.11-pyyaml-6.0.1/lib/python3.11/site-packages', '/nix/store/0vd5xdkl5gm9wy2q6mp236dj67x18ddc-python3.11-bcrypt-4.0.1/lib/python3.11/site-packages', '/nix/store/jdpixdi02m0xvyx1bbgg13vsrpay6xy4-python3.11-chroma-hnswlib-0.7.3/lib/python3.11/site-packages', '/nix/store/r4njqk18jw8iz7bxhy2w6z28nlmriyzh-python3.11-numpy-1.26.0/lib/python3.11/site-packages', '/nix/store/zw8cahjklrxzk57vcswgdf5xa3kq4x57-python3.11-fastapi-0.103.2/lib/python3.11/site-packages', '/nix/store/43h16yvnp4avdn5yfmpjyhvvfyaa728m-python3.11-anyio-3.7.1/lib/python3.11/site-packages', '/nix/store/gv4j5smrrrph0djxjvp00n532ms52ydk-python3.11-idna-3.4/lib/python3.11/site-packages', '/nix/store/qs2b58ymhyf0ifmnd86vmpk91cvicl8s-python3.11-sniffio-1.3.0/lib/python3.11/site-packages', 
'/nix/store/qylbadzz4h0a6jnzdvq9rw8p0l2lax1m-python3.11-email-validator-2.1.0.post1/lib/python3.11/site-packages', '/nix/store/wx3r52d4i5xyr5r67jy9bhs1qmyd7rck-python3.11-dnspython-2.4.2/lib/python3.11/site-packages', '/nix/store/8l4h2g9x3rp0cplj5all0ijnp7lxggyk-python3.11-httpx-0.25.1/lib/python3.11/site-packages', '/nix/store/czql6cyiz0qxvdwsihg33jpsr105v6vp-python3.11-certifi-2023.11.17/lib/python3.11/site-packages', '/nix/store/w9528ppdya49fck35j4l78sdvflcmfx8-python3.11-h2-4.1.0/lib/python3.11/site-packages', '/nix/store/0mjmkxba4cj8l29y8hgkfv9wapx7jj0p-python3.11-hpack-4.0.0/lib/python3.11/site-packages', '/nix/store/ar79550m3mpm2bn7k7n7pka7lrj4s9jc-python3.11-hyperframe-6.0.1/lib/python3.11/site-packages', '/nix/store/pgg23i2ncmn2z5zyciw2pzqzkrgdb0jr-python3.11-httpcore-1.0.2/lib/python3.11/site-packages', '/nix/store/jh278m9alrra1xh1xb6frj8cqf4ni0f3-python3.11-h11-0.14.0/lib/python3.11/site-packages', '/nix/store/nr1ha1kqriks38m669wsq2zrnz4h09x4-python3.11-itsdangerous-2.1.2/lib/python3.11/site-packages', '/nix/store/6kycfhjg6fyjbna8q3cvl3d257s84gfw-python3.11-jinja2-3.1.2/lib/python3.11/site-packages', '/nix/store/l2vvhz5k50zaklsf71bn4ffq3g83vjbc-python3.11-markupsafe-2.1.3/lib/python3.11/site-packages', '/nix/store/7z3x488awkm09bj1ihs6971gwpfy234a-python3.11-orjson-3.9.10/lib/python3.11/site-packages', '/nix/store/9z4mgfrwwckcqs574pg15frm8zgyqcs8-python3.11-pydantic-2.5.1/lib/python3.11/site-packages', '/nix/store/nphn40hlkm1npawxjlw7mwgdnkpayns6-python3.11-annotated-types-0.6.0/lib/python3.11/site-packages', '/nix/store/jrakbf2vvrvgb4h5sixvamkg00ah3y9a-python3.11-pydantic-core-2.14.3/lib/python3.11/site-packages', '/nix/store/7rssb67hcbmbnyypda3qv9badvvrlkwx-python3.11-typing-extensions-4.8.0/lib/python3.11/site-packages', '/nix/store/sqlpvkf00qrvw6kfxnrp34zn6ikrnf2m-python3.11-pydantic-extra-types-2.1.0/lib/python3.11/site-packages', '/nix/store/vqc19qmxyi97q94wmmxgrfmz8mfxz1c8-python3.11-pydantic-settings-2.1.0/lib/python3.11/site-packages', 
'/nix/store/sbqclri5xy0qcb9crp9sici5cr89az3h-python3.11-python-dotenv-1.0.0/lib/python3.11/site-packages', '/nix/store/dppnjspkcz3gvwrrrrp078qiq8sqchwp-python3.11-python-multipart-0.0.6/lib/python3.11/site-packages', '/nix/store/djs1c2qhnq0aydwc6d3i843m033165xs-python3.11-starlette-0.27.0/lib/python3.11/site-packages', '/nix/store/bjil9iqi5f775dkrz3cyzb43gkdz94fp-python3.11-ujson-5.8.0/lib/python3.11/site-packages', '/nix/store/ac7mgw54c591ph3k6qkrg99izhk4aigp-python3.11-uvicorn-0.24.0.post1/lib/python3.11/site-packages', '/nix/store/56ahhsanpz43wr5kjkg6gsakblvvjaha-python3.11-click-8.1.7/lib/python3.11/site-packages', '/nix/store/p7bsslpyhk55kvpvrcwckql6flfyhdpb-python3.11-httptools-0.6.1/lib/python3.11/site-packages', '/nix/store/bn3mb1n22pb0jgmkih0ha5jzmjckiri5-python3.11-uvloop-0.19.0/lib/python3.11/site-packages', '/nix/store/6m76h3pczw3x03ncv24lwfz39pqrg017-python3.11-watchfiles-0.21.0/lib/python3.11/site-packages', '/nix/store/lvfvi5yzmqc5hfjq7lhcnmiyxm378n8r-python3.11-websockets-11.0.3/lib/python3.11/site-packages', '/nix/store/vcf19a80dzm8kk8fa3ywc98i8gayavk6-python3.11-grpcio-1.59.3/lib/python3.11/site-packages', '/nix/store/9bdzi6jkwz32iw86d94cxzag7if9mqms-python3.11-importlib-resources-6.1.1/lib/python3.11/site-packages', '/nix/store/lqvgnj4asnv1izm6h06jacvqmcpan851-python3.11-kubernetes-28.1.0/lib/python3.11/site-packages', '/nix/store/g4snm1krqc4hzz6kgy515ya6baxgaw73-python3.11-google-auth-2.23.4/lib/python3.11/site-packages', '/nix/store/wsdswr7xx6fzswq04ak0nam1prwz6lf9-python3.11-cachetools-5.3.2/lib/python3.11/site-packages', '/nix/store/9qvlssjxyw8y9anq5svpzapc7avw9mza-python3.11-pyasn1-modules-0.3.0/lib/python3.11/site-packages', '/nix/store/whd1zdqk56jk2r8ln7fsx7ka7qbdk82p-python3.11-pyasn1-0.5.0/lib/python3.11/site-packages', '/nix/store/b0j6v1sxj97dy57id76pp4s2qnx87241-python3.11-rsa-4.9/lib/python3.11/site-packages', '/nix/store/jpyk5bgz9hkrsds3ai9ngz00x775dwz5-python3.11-oauthlib-3.2.2/lib/python3.11/site-packages', 
'/nix/store/v9k4f2al4d4vnqfvnviq5049z0l0glp1-python3.11-requests-2.31.0/lib/python3.11/site-packages', '/nix/store/z1rsg5y1nd4j2xw9sb4kd2k36d3ri0sx-python3.11-charset-normalizer-3.3.2/lib/python3.11/site-packages', '/nix/store/nzvs37zmbx8ixvk07lgi0mf44rvjdsqx-python3.11-requests-oauthlib-1.3.1/lib/python3.11/site-packages', '/nix/store/b01914dyn21ga5866ijfxa74aznaagzb-python3.11-websocket-client-1.6.4/lib/python3.11/site-packages', '/nix/store/pizdvksvm6fyplyzlf7vnjgvkigrichl-python3.11-onnxruntime-1.16.2/lib/python3.11/site-packages', '/nix/store/z061vhw45nrqjzl85b0s8pakync64gw2-python3.11-coloredlogs-15.0.1/lib/python3.11/site-packages', '/nix/store/8padnkxs333ypa71szjj04krzgwz969r-python3.11-humanfriendly-10.0/lib/python3.11/site-packages', '/nix/store/ba13pr6ivbndb3c5p5kkq0ckvn1qbbvi-python3.11-flatbuffers-23.5.26/lib/python3.11/site-packages', '/nix/store/csli8yf0s802wf06rfn91103zazsadyd-python3.11-packaging-23.1/lib/python3.11/site-packages', '/nix/store/5gf0jckd8s234ngmhgr6z8x6i7sisvx8-python3.11-protobuf-4.25.1/lib/python3.11/site-packages', '/nix/store/frrfi5l73kcvpjc4ld8dnflnm3g65da1-python3.11-sympy-1.12/lib/python3.11/site-packages', '/nix/store/23a1y9sbxqrrbcs7qjhkxmhjzrnjhchi-python3.11-mpmath-1.3.0/lib/python3.11/site-packages', '/nix/store/mc5yjis9liz9d0kgwv18m4xnlsbylqmh-python3.11-opentelemetry-api-1.21.0/lib/python3.11/site-packages', '/nix/store/bh72mm08f2965jhz7pmxfic5zaacvnsa-python3.11-deprecated-1.2.14/lib/python3.11/site-packages', '/nix/store/dpnqbc56wsv2xkhdgpn78j3x2kx577k4-python3.11-wrapt-1.16.0/lib/python3.11/site-packages', '/nix/store/hzjzyzh081gm1mvh7c57qawrlqj20aa4-python3.11-importlib-metadata-6.8.0/lib/python3.11/site-packages', '/nix/store/6gmhn5xd36wdswyjvpjd4n201a6hycyl-python3.11-zipp-3.17.0/lib/python3.11/site-packages', '/nix/store/x1y4j26h25ca73l08cjlqqaqjrglb42z-python3.11-toml-0.10.2/lib/python3.11/site-packages', 
'/nix/store/vdb7vzlk92dlcipaabc5cbvqa2k0zxc8-python3.11-opentelemetry-exporter-otlp-proto-grpc-1.21.0/lib/python3.11/site-packages', '/nix/store/yhva391hb2c7gs0qx2y0vd5jv4rgifms-python3.11-backoff-2.2.1/lib/python3.11/site-packages', '/nix/store/xylbhf1khflicmfqaaip7ysp3mlacnpn-python3.11-googleapis-common-protos-1.61.0/lib/python3.11/site-packages', '/nix/store/qfk5770k9f2np0lij1hzihza2bw6f4v6-python3.11-opentelemetry-exporter-otlp-proto-common-1.21.0/lib/python3.11/site-packages', '/nix/store/mj6zzwj3z5y994xxiz7gzav2sp4q8lqj-python3.11-opentelemetry-proto-1.21.0/lib/python3.11/site-packages', '/nix/store/0zp6zbc3az9dpp0gyqdvcxls44a1qwzc-python3.11-opentelemetry-sdk-1.21.0/lib/python3.11/site-packages', '/nix/store/bq7w2dzcg8w2xdvhzci86szb3hmdmiqn-python3.11-opentelemetry-semantic-conventions-0.42b0/lib/python3.11/site-packages', '/nix/store/9jdv1lvzpr4s4fil8g843phg1zg23xf3-python3.11-overrides-7.4.0/lib/python3.11/site-packages', '/nix/store/0h7dphygxxrj96aw3cmvas7lhl6p0lq8-python3.11-posthog-3.0.2/lib/python3.11/site-packages', '/nix/store/3vyi6d81043v49m7y49mr2nfg2y61zmh-python3.11-monotonic-1.6/lib/python3.11/site-packages', '/nix/store/1c6gp7ipq678q7dipbsa86dmhnlp28fq-python3.11-pulsar-client-3.3.0/lib/python3.11/site-packages', '/nix/store/1xbqkkp5v0kgf1p4v4h0yzapj8rn27mr-python3.11-pypika-0.48.9/lib/python3.11/site-packages', '/nix/store/jbqkzpdk6687v1wp7n886ingx0iv4f8z-python3.11-tenacity-8.2.3/lib/python3.11/site-packages', '/nix/store/6rapmdwx7xkl10sdxj4h4d2dafp1zi27-python3.11-tokenizers-0.15.0/lib/python3.11/site-packages', '/nix/store/ndswj5m4rksx7x943ig2jk5m967kjpx1-python3.11-huggingface-hub-0.19.4/lib/python3.11/site-packages', '/nix/store/primlivzhhlmp2ss52b4hp88bddchnx7-python3.11-filelock-3.13.1/lib/python3.11/site-packages', '/nix/store/wg8j3b0lsjwqgd10xv3d806qzvmsnzfh-python3.11-fsspec-2023.10.0/lib/python3.11/site-packages', '/nix/store/41y4hzk0nfv0p8m8hmjnzp4lgy7801k9-python3.11-aiohttp-3.9.0/lib/python3.11/site-packages', 
'/nix/store/jr1xwmshkw17ryzvivvgb9g43fn71c50-python3.11-aiosignal-1.3.1/lib/python3.11/site-packages', '/nix/store/mmbb47gix3lr0b8wbrqzlm5ifcaabaw5-python3.11-frozenlist-1.4.0/lib/python3.11/site-packages', '/nix/store/29lrzpqkryrcjvqbv87viik4m3byk8pv-python3.11-attrs-23.1.0/lib/python3.11/site-packages', '/nix/store/50s6g2ra17xrqs8xy96vk6f2a5v6ld6w-python3.11-multidict-6.0.4/lib/python3.11/site-packages', '/nix/store/nmjwgcmxq3yc1n92in44y041jq1l0svr-python3.11-yarl-1.9.2/lib/python3.11/site-packages', '/nix/store/a8gxcv23kqx6w8bzpsrci8ppcrm4id56-python3.11-tqdm-4.66.1/lib/python3.11/site-packages', '/nix/store/6inp1nh9q63jy299sq6za66wy5m995ki-python3.11-typer-0.9.0/lib/python3.11/site-packages', '/nix/store/psb5jx0dii49cahqmwv172x2v8ks220h-python3.11-colorama-0.4.6/lib/python3.11/site-packages', '/nix/store/hah5gf21mxy1zlr3bbhjny3lvawa7rxb-python3.11-rich-13.7.0/lib/python3.11/site-packages', '/nix/store/935krhlk6k8vzimcqvdp7pmng35c6p9k-python3.11-markdown-it-py-3.0.0/lib/python3.11/site-packages', '/nix/store/y26fawfx2c8z3199cdqnk6bz7m3x3yvq-python3.11-mdurl-0.1.2/lib/python3.11/site-packages', '/nix/store/9fipp3in4hz3cdgs8xa5i6fgqwzs5aj3-python3.11-pygments-2.17.1/lib/python3.11/site-packages', '/nix/store/39adm59s7asl0yd947z8zrnk18f8crxf-python3.11-shellingham-1.5.4/lib/python3.11/site-packages', '/nix/store/67mkq2lc2hcysk7f6550hiv8jpbgkv3g-python3.11-injector-0.21.0/lib/python3.11/site-packages', '/nix/store/qyn2zjd6q3231lca64z1dxv04wa6gi1l-python3.11-llama-index-0.9.3/lib/python3.11/site-packages', '/nix/store/h9k3grywn89m4wl5jahk1mx4h6ab1f5a-python3.11-sqlalchemy-2.0.23/lib/python3.11/site-packages', '/nix/store/s8gxxgy7q4g3wycwi00klp19hpsz4mw3-python3.11-greenlet-3.0.1/lib/python3.11/site-packages', '/nix/store/fl5g27jx13pcw1w975vz6w0k8yjx2vqv-python3.11-aiostream-0.5.2/lib/python3.11/site-packages', '/nix/store/xf8pn2fy1czifvh89pgq2nk0cnqhra5j-python3.11-beautifulsoup4-4.12.2/lib/python3.11/site-packages', 
'/nix/store/m909qj20nlxf6dhl2dmkkggg91fjf290-python3.11-soupsieve-2.5/lib/python3.11/site-packages', '/nix/store/c0kvf39qz08maqlz7n7h8hm7cc6wshp1-python3.11-dataclasses-json-0.5.14/lib/python3.11/site-packages', '/nix/store/dc7rghrrix74x139vgypb3c74ha2q5hy-python3.11-marshmallow-3.20.1/lib/python3.11/site-packages', '/nix/store/bmaxlniwz4f4c03gj0bb6q142cabi1qg-python3.11-typing-inspect-0.9.0/lib/python3.11/site-packages', '/nix/store/jsxlbn9cgym2q0d7d9x2jw83qm5ib95i-python3.11-mypy-extensions-1.0.0/lib/python3.11/site-packages', '/nix/store/rf5i3amyj1r6i9hnh55rdyi1dwnyg96q-python3.11-nest-asyncio-1.5.8/lib/python3.11/site-packages', '/nix/store/2pdx8443ldjmxzbssvhhwzz4vylp6pq4-python3.11-nltk-3.8.1/lib/python3.11/site-packages', '/nix/store/zkllxhigcx9l8zjrlncjgphy47q0hacm-python3.11-joblib-1.3.2/lib/python3.11/site-packages', '/nix/store/a1rqkp2g516qdyxgwxvzkn0j64al1d79-python3.11-regex-2023.10.3/lib/python3.11/site-packages', '/nix/store/pwx0iwp660js5f6zf6cywjsj9nlfxdlj-python3.11-openai-1.3.3/lib/python3.11/site-packages', '/nix/store/qi51kccxf1b6jqqp7cj77g8zf5k677rv-python3.11-distro-1.8.0/lib/python3.11/site-packages', '/nix/store/f3v6zxnj2prag90i98km70psm8yin4yv-python3.11-optimum-1.14.1/lib/python3.11/site-packages', '/nix/store/rpdmgfjgzs3cl39iaqaq8rg3a45zlb9q-python3.11-datasets-2.15.0/lib/python3.11/site-packages', '/nix/store/gw71k8vv79kms0p34336j554c4qbawy3-python3.11-dill-0.3.7/lib/python3.11/site-packages', '/nix/store/2m09gnsycyfba58ni4rcsi68kis7r54a-python3.11-multiprocess-0.70.15/lib/python3.11/site-packages', '/nix/store/498mc4zp9by0facg5lhx7vf3f2an53sf-python3.11-pandas-2.1.3/lib/python3.11/site-packages', '/nix/store/hhqvfndgbkfzk8xvvfr9j4dm2knyysmm-python3.11-pytz-2023.3.post1/lib/python3.11/site-packages', '/nix/store/l06zsqwassmm8js5z9121k1a6jfrsn9z-python3.11-tzdata-2023.3/lib/python3.11/site-packages', '/nix/store/rppls1xwvxqidydj9k79jz71lrv0x4ar-python3.11-pyarrow-14.0.1/lib/python3.11/site-packages', 
'/nix/store/f96vx5isp1zkynpwhaafq2gq7yniqzld-python3.11-pyarrow-hotfix-0.5/lib/python3.11/site-packages', '/nix/store/2026b94p7jr5k9a3gphs3f616ps743w7-python3.11-xxhash-3.4.1/lib/python3.11/site-packages', '/nix/store/42ywghdw90cdqvrfk40m8lb497sm6qwc-python3.11-evaluate-0.4.1/lib/python3.11/site-packages', '/nix/store/4mbzki1jyxxxml297qih5ww5brx355wx-python3.11-responses-0.18.0/lib/python3.11/site-packages', '/nix/store/cq51xmnc4hrki59nqxrx8y2xdr2q18fl-python3.11-torch-2.1.1/lib/python3.11/site-packages', '/nix/store/ia312s65ib6jz9mfd1ag0bf6frsvigla-python3.11-networkx-3.2.1/lib/python3.11/site-packages', '/nix/store/764bnmbpiq5jqbdqagi70c411zar2ij3-python3.11-nvidia-cublas-cu12-12.1.3.1/lib/python3.11/site-packages', '/nix/store/gv8n1j9cb0kxy1vfq8qfsys9ppcw9r1m-python3.11-nvidia-cuda-cupti-cu12-12.1.105/lib/python3.11/site-packages', '/nix/store/kda8nqvvskckfbp0caxwqx9inwzlfji2-python3.11-nvidia-cuda-nvrtc-cu12-12.1.105/lib/python3.11/site-packages', '/nix/store/a8n1wzlsmrdrhwkzrhbpnysx6avlbkr5-python3.11-nvidia-cuda-runtime-cu12-12.1.105/lib/python3.11/site-packages', '/nix/store/ra3mrp0vwind336q38s42x92n8h6z1av-python3.11-nvidia-cudnn-cu12-8.9.2.26/lib/python3.11/site-packages', '/nix/store/mf9w4vi7r1awy2pvvqb97w4m4swpinpa-python3.11-nvidia-cufft-cu12-11.0.2.54/lib/python3.11/site-packages', '/nix/store/fs00xk0hnj2rgzvv8wb2mkbr33zzn79d-python3.11-nvidia-curand-cu12-10.3.2.106/lib/python3.11/site-packages', '/nix/store/211sqbkvjdlib054z08p56gghdz0svyf-python3.11-nvidia-nccl-cu12-2.18.1/lib/python3.11/site-packages', '/nix/store/x8jmlkqqv45iw47lrv0669amd1n7n1c6-python3.11-nvidia-nvtx-cu12-12.1.105/lib/python3.11/site-packages', '/nix/store/rw7454npa7x9rrxcp440yikrh1xvd7zx-python3.11-triton-2.1.0/lib/python3.11/site-packages', '/nix/store/86cb9yq9bjazz03zx07b627ii0d1by3p-python3.11-transformers-4.35.2/lib/python3.11/site-packages', '/nix/store/dcw694zfgn539bm2vaspap69x832b0j7-python3.11-accelerate-0.24.1/lib/python3.11/site-packages', 
'/nix/store/x0xrhvqmyv5ncmg5l760qqhzy7hr2p4b-python3.11-psutil-5.9.6/lib/python3.11/site-packages', '/nix/store/mwp6zfiydla4lpqsdj7vnlvn155bgdx0-python3.11-safetensors-0.3.3/lib/python3.11/site-packages', '/nix/store/a1xgbdklrph33knskvns4b958v8hk5fr-python3.11-sentencepiece-0.1.99/lib/python3.11/site-packages', '/nix/store/d1ax823vh518zr1pnn0d2yg3hdaj1jzn-python3.11-tiktoken-0.5.1/lib/python3.11/site-packages', '/nix/store/y8yq448xka665bqyd5gvdhajqdbsnyys-python3.11-setuptools-68.2.2/lib/python3.11/site-packages', '/nix/store/advxvggm3hjwbcfydwa3bi6861938qwf-python3.11-pypdf-3.17.1/lib/python3.11/site-packages', '/nix/store/wd9rsycgkj1iyldjw832j344wk2zakdn-python3.11-qdrant-client-1.6.9/lib/python3.11/site-packages', '/nix/store/rk8acyasiwk9mipp0qgkjw14d9ywf6fs-python3.11-grpcio-tools-1.59.3/lib/python3.11/site-packages', '/nix/store/cwv1k98f0hw0g755dqm4r89x2j4g27bn-python3.11-portalocker-2.8.2/lib/python3.11/site-packages', '/nix/store/bj161fjjq503pl85kck23z9l2b2qrzzh-python3.11-watchdog-3.0.0/lib/python3.11/site-packages', '/nix/store/k69b4iw3r2imdrss5qi6cy1r25kp33k5-python3.11-black-22.12.0/lib/python3.11/site-packages', '/nix/store/jx8cr3n6ssamd27zlj593m8hfxdffjqb-python3.11-pathspec-0.11.2/lib/python3.11/site-packages', '/nix/store/9gr7kb390xg8zl1km0g3bp53aj4xrid8-python3.11-platformdirs-3.11.0/lib/python3.11/site-packages', '/nix/store/7vzkdi8s02h1pipbkyqqd7bhf3d9gvhl-python3.11-mypy-1.7.0/lib/python3.11/site-packages', '/nix/store/c4hzc7cja1nkf9aci6vsy43mb7arpn70-python3.11-pre-commit-2.21.0/lib/python3.11/site-packages', '/nix/store/9il100vz9859ldcy3xpg78vk0vnwc07w-python3.11-cfgv-3.4.0/lib/python3.11/site-packages', '/nix/store/16j9bchfhrfmwchh32iccaid5yckqj4g-python3.11-identify-2.5.32/lib/python3.11/site-packages', '/nix/store/qjljvm1hj629b8pnyx04nb6yiac7nn5i-python3.11-nodeenv-1.8.0/lib/python3.11/site-packages', '/nix/store/v7h4q2ldfaq5zb8fxx9lgb5c0wxvjyms-python3.11-virtualenv-20.24.6/lib/python3.11/site-packages', 
'/nix/store/mh7416sm2bpacvm7zd1w9w88h2672grr-python3.11-distlib-0.3.7/lib/python3.11/site-packages', '/nix/store/hv57g8r3bzy9mplrnfyg44ghzcxgmdmq-python3.11-pytest-7.4.3/lib/python3.11/site-packages', '/nix/store/mvc1azi7cifkj72f1bbf7bg38d6kkjg0-python3.11-iniconfig-2.0.0/lib/python3.11/site-packages', '/nix/store/wmad2pifbh3bwkqgjhc15sfk88hf5a44-python3.11-pluggy-1.3.0/lib/python3.11/site-packages', '/nix/store/96xw7jflwd3hp06gs3fzqfqdfm8djwmh-python3.11-pytest-asyncio-0.21.1/lib/python3.11/site-packages', '/nix/store/qvccbysdqm1gz77sp786fnl2g0dy33ni-python3.11-pytest-cov-3.0.0/lib/python3.11/site-packages', '/nix/store/vxg1qvq1imw1yrm28zi2f88y9hdgfzam-python3.11-coverage-7.3.2/lib/python3.11/site-packages', '/nix/store/frxpbr520swl36mna02kj0pdgm339qjh-python3.11-ruff-0.1.6/lib/python3.11/site-packages', '/nix/store/8d5fvkzwbcfx2hyb7nlxlf0740wgf4k3-python3.11-types-pyyaml-6.0.12.12/lib/python3.11/site-packages', '/nix/store/52m7m43j93dsihmanpmvzdn7vbbjp4mz-python3.11-llama-cpp-python-0.2.18/lib/python3.11/site-packages', '/nix/store/qm2ld7w4ydrz642jqs9fckv7hq60vd6d-python3.11-diskcache-5.6.3/lib/python3.11/site-packages', '/nix/store/ralbaclz3pqj7rpwfd6bg4d9ds8s8ixf-python3.11-scikit-build-core-0.5.1/lib/python3.11/site-packages', '/nix/store/z3zlixmimjjy11gf760v69hbgx5sas8c-python3.11-pyproject-metadata-0.7.1/lib/python3.11/site-packages', '/nix/store/frdzqma21jr8n4l7x3gplvh8c9bqsjd7-python3.11-sentence-transformers-2.2.2/lib/python3.11/site-packages', '/nix/store/m218rxkcvj0f2ncfl9ffbg4knii727rs-python3.11-scikit-learn-1.3.2/lib/python3.11/site-packages', '/nix/store/kwd5gcgc8zzhx40vxfjp9aiq5kzy24zg-python3.11-scipy-1.11.4/lib/python3.11/site-packages', '/nix/store/p37ndb136ja65fykpssfiydflpxbwhmm-python3.11-pybind11-2.11.1/lib/python3.11/site-packages', '/nix/store/ygzd4ck0mkr2rj5aamfpxa5vmn039vgb-python3.11-threadpoolctl-3.2.0/lib/python3.11/site-packages', 
'/nix/store/8zmchs5dqf9phwvfqnx4dyynilvsvk00-python3.11-torchvision-0.16.1/lib/python3.11/site-packages', '/nix/store/mqzsv8wma5dvwpwlk8a9naci407b3bvw-python3.11-pillow-10.1.0/lib/python3.11/site-packages', '/nix/store/4m635884lgcf37mv7nqpvkkh48z42bk8-python3.11-gradio-4.4.1/lib/python3.11/site-packages', '/nix/store/mh2cmpgj58rli42nqq537pdj6p9rhzgx-python3.11-aiofiles-23.2.1/lib/python3.11/site-packages', '/nix/store/hn0yqy9kq5dgyinm797s235vp6bi64dz-python3.11-altair-5.1.2/lib/python3.11/site-packages', '/nix/store/8pq66jiq3pcxpfs02cixij8jgld5ryb9-python3.11-jsonschema-4.20.0/lib/python3.11/site-packages', '/nix/store/grqd7y5rrzqp50wvrhmrvvapb8qhcn3g-python3.11-jsonschema-specifications-2023.11.1/lib/python3.11/site-packages', '/nix/store/hnbdbl0inx889hkq2p3xqv8pcpwxsc4z-python3.11-referencing-0.31.0/lib/python3.11/site-packages', '/nix/store/idnm37agcjv5khnzs7f6328qdpp9bsf4-python3.11-rpds-py-0.13.0/lib/python3.11/site-packages', '/nix/store/dxl4dam26k2zhw7ffamlx89c6pfzq80s-python3.11-toolz-0.12.0/lib/python3.11/site-packages', '/nix/store/zd1b33nk2b1hjgwsx9v6lgg0nrj62z6g-python3.11-ffmpy-0.3.1/lib/python3.11/site-packages', '/nix/store/01pz0wwsslbj2pfchx9w0n3srshilspf-python3.11-gradio-client-0.7.0/lib/python3.11/site-packages', '/nix/store/q0lmqszn47ljy964g899w1nklisycr1q-python3.11-matplotlib-3.8.2/lib/python3.11/site-packages', '/nix/store/a26n3aik25jlfbj57q1nzqhp5m33kqr3-python3.11-contourpy-1.2.0/lib/python3.11/site-packages', '/nix/store/230bm4r17hd7bymf9fk5k9r2wnawn64c-python3.11-cycler-0.12.1/lib/python3.11/site-packages', '/nix/store/m3jmx43xdbhgjrr3flvhlzcivmjfp2bf-python3.11-fonttools-4.44.3/lib/python3.11/site-packages', '/nix/store/3fcf23mhkacl0cl4j6cmw9m29h9ihn39-python3.11-kiwisolver-1.4.5/lib/python3.11/site-packages', '/nix/store/3c0fyvsy2ls0j5nzqji32w929dsrdacr-python3.11-pyparsing-3.1.1/lib/python3.11/site-packages', '/nix/store/s5brxpy5p00xrb3ymwabsc1v6j2550ii-python3.11-pydub-0.25.1/lib/python3.11/site-packages', 
'/nix/store/p4cjfqc4xj5rlp87skg1cn5h99pnjaln-python3.11-semantic-version-2.10.0/lib/python3.11/site-packages', '/nix/store/73vmdfj0j01ym5wvvx7r14pbkd23pzzf-python3.11-tomlkit-0.12.0/lib/python3.11/site-packages', '/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python311.zip', '/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11', '/nix/store/qp5zys77biz7imbk6yy85q5pdv7qk84j-python3-3.11.6/lib/python3.11/lib-dynload']
Is there a way we can make it `nix run`-able? I don't think most users will be as persistent 😅
github:MatthewCroughan/privateGPT worked for me after I ran `python scripts/setup`.
It seems privateGPT can only run a subset of GGUF models, and it's not clear which ones until you try them. That said, I do appreciate that it's better to have the dependencies specified properly.