Could I load the GPTQ-for-SantaCoder/starcoder-GPTQ-8bit-128g model?
heber opened this issue · 1 comment
Could I load the GPTQ-for-SantaCoder/starcoder-GPTQ-8bit-128g model offline?
When I run the command `python main.py --pretrained {the path to starcoder-GPTQ-8bit-128g}`, I get an error:
File "...transformers/pipelines/__init__.py", line 779, in pipeline
    framework, model = infer_framework_load_model(
File "...transformers/pipelines/base.py", line 271, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model starcoder-GPTQ-8bit-128g with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt2.modeling_gpt2.GPT2LMHeadModel'>).
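For context, the same error reproduces with a plain `pipeline()` call (a minimal sketch; the local path below is a placeholder, not the actual checkout location):

```python
from transformers import pipeline

# Minimal repro sketch. pipeline() calls infer_framework_load_model(),
# which tries each candidate class in turn (AutoModelForCausalLM, then
# GPT2LMHeadModel) and raises the ValueError above when all of them fail.
generator = pipeline(
    "text-generation",
    model="/path/to/starcoder-GPTQ-8bit-128g",  # hypothetical local path
)
```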
I couldn't find any configuration file under mayank31398/starcoder-GPTQ-8bit-128g, so I suspect this model can't be loaded directly through transformers.
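One quick way to confirm whether the Hub repo ships a transformers `config.json` (a sketch using huggingface_hub; assumes network access to the Hub):

```python
from huggingface_hub import list_repo_files

# AutoModelForCausalLM needs a config.json in the repo (or local folder)
# to load a checkpoint directly; list the repo's files to check for it.
files = list_repo_files("mayank31398/starcoder-GPTQ-8bit-128g")
print("config.json present:", "config.json" in files)
```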