0cc4m/KoboldAI

Issue loading a 30B model that previously worked

szarkab123 opened this issue · 1 comment

Windows 10, RTX 4090, metalx/alpasta 30B, 60 GPU layers.

I had this OOM problem before; it was fixed on the model-structure-update branch.
Then I updated to the latestgptq branch and this error appeared.

Previously good commit hash: 4180620
Bad commit hash: a2d01bb

Error message:
https://pastebin.com/kTRzHP9z

0cc4m commented

Please try again with the latest version.