0cc4m/KoboldAI

I cannot load any AI models and I keep getting this error no matter what I do. This started after I ran "git pull" on this repository.

0xYc0d0ne opened this issue · 1 comment

Exception in thread Thread-14:
Traceback (most recent call last):
  File "B:\python\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "B:\python\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "B:\python\lib\site-packages\socketio\server.py", line 731, in _handle_event_internal
    r = server._trigger_event(data[0], namespace, sid, *data[1:])
  File "B:\python\lib\site-packages\socketio\server.py", line 756, in _trigger_event
    return self.handlers[namespace][event](*args)
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 282, in _handler
    return self._handle_event(handler, message, namespace, sid,
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 828, in _handle_event
    ret = handler(*args)
  File "aiserver.py", line 615, in g
    return f(*a, **k)
  File "aiserver.py", line 3191, in get_message
    load_model(use_gpu=msg['use_gpu'], gpu_layers=msg['gpu_layers'], disk_layers=msg['disk_layers'], online_model=msg['online_model'])
  File "aiserver.py", line 1980, in load_model
    model.load(
  File "C:\KoboldAI\modeling\inference_model.py", line 177, in load
    self._load(save_model=save_model, initial_load=initial_load)
  File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 198, in _load
    self.model = self._get_model(self.get_local_model_path(), tf_kwargs)
  File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 378, in _get_model
    model = load_quant_offload(llama_load_quant, utils.koboldai_vars.custmodpth, path_4bit, utils.koboldai_vars.gptq_bits, groupsize, self.gpu_layers_list, force_bias=v2_bias)
TypeError: load_quant_offload() got an unexpected keyword argument 'force_bias'

0cc4m commented

You need to update the gptq module. Run install_requirements again.
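For context, this is the classic symptom of a caller updated by git pull while an installed dependency stayed on an older version: the new aiserver.py passes force_bias, but the stale gptq module's load_quant_offload has no such parameter, so Python raises a TypeError before any model loading begins. A minimal sketch of the pattern (the stand-in function below only mimics the older signature; it is not the actual gptq code):

```python
# Stand-in for the OLD gptq module: its load_quant_offload predates the
# force_bias parameter that the updated aiserver.py now passes.
def load_quant_offload(load_fn, model_path, path_4bit, bits, groupsize, gpu_layers):
    return (model_path, bits)  # placeholder for the real loading logic

# The updated caller supplies the new keyword, which the stale module
# rejects immediately with the exact error from the traceback above.
try:
    load_quant_offload(None, "llama-7b", "model-4bit.safetensors", 4, 128, [32],
                       force_bias=True)
except TypeError as e:
    print(e)  # → load_quant_offload() got an unexpected keyword argument 'force_bias'
```

Re-running the installer refreshes the gptq module so its signature matches the updated caller again.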