cannot load model
loekTheDreamer opened this issue · 3 comments
loekTheDreamer commented
There is an issue in the README regarding model loading: it mentions the 'gguf' model format but lacks clear loading instructions. Is file loading implemented yet? The result is always "No model found".
ramyareye commented
@loekTheDreamer I tried it with a model downloaded with https://lmstudio.ai/
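One way to use a model obtained elsewhere (for example through LM Studio) is to fetch the .gguf at runtime into the app's documents directory and pass that path to initLlama. A minimal sketch, assuming react-native-fs is installed; the URL and filename below are placeholders:

```ts
import RNFS from 'react-native-fs';

// Placeholder URL and filename; any direct link to a .gguf file works the same way.
const MODEL_URL = 'https://example.com/models/some-model-q4_k_m.gguf';
const MODEL_PATH = `${RNFS.DocumentDirectoryPath}/some-model-q4_k_m.gguf`;

async function downloadModel(): Promise<string> {
  // Skip the download if the file is already on disk.
  if (await RNFS.exists(MODEL_PATH)) {
    return MODEL_PATH;
  }

  const { promise } = RNFS.downloadFile({
    fromUrl: MODEL_URL,
    toFile: MODEL_PATH,
  });
  const { statusCode } = await promise;
  if (statusCode !== 200) {
    throw new Error(`Model download failed with HTTP ${statusCode}`);
  }
  return MODEL_PATH;
}
```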
loekTheDreamer commented
When using the example project the models load fine.
But when adding the file to the project directory and using RNFS, the error is always "No model found":
```ts
import RNFS from 'react-native-fs'
import { Platform } from 'react-native'
import { initLlama } from 'llama.rn'

initLlama({
  model: `${RNFS.MainBundlePath}/gp2-q2_k.gguf`,
  // /Users/doris/Library/Developer/CoreSimulator/Devices/FB284989-91D5-4526-BC32-3B6CB88D84A7/data/Containers/Bundle/Application/70FB6F7D-3F5A-45DC-B21F-FEAEB2870926/Thailate.app/gp2-q2_k.gguf
  use_mlock: true,
  n_gpu_layers: Platform.OS === 'ios' ? 0 : 0, // > 0: enable GPU
  // embedding: true,
})
```
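A useful first step when debugging "No model found" is to check that the file actually exists at the path being passed in, since on iOS `MainBundlePath` only contains files that were added to the Xcode target. This is a rough sketch, not a confirmed fix; it assumes react-native-fs, reuses the gp2-q2_k.gguf filename from the snippet above, and falls back to the documents directory (e.g. for a model downloaded at runtime):

```ts
import RNFS from 'react-native-fs';
import { Platform } from 'react-native';
import { initLlama } from 'llama.rn';

// Same filename as in the snippet above; adjust to whatever file is actually
// copied into the app bundle or downloaded to the device.
const MODEL_FILE = 'gp2-q2_k.gguf';

async function loadModel() {
  // MainBundlePath is iOS-only; on Android the model is usually downloaded or
  // copied into DocumentDirectoryPath instead.
  const bundledPath = `${RNFS.MainBundlePath}/${MODEL_FILE}`;
  const documentsPath = `${RNFS.DocumentDirectoryPath}/${MODEL_FILE}`;

  // Prefer the bundled copy, fall back to the documents directory.
  let modelPath: string | null = null;
  if (await RNFS.exists(bundledPath)) {
    modelPath = bundledPath;
  } else if (await RNFS.exists(documentsPath)) {
    modelPath = documentsPath;
  }

  if (!modelPath) {
    // If this throws, the file was never copied onto the device, which would
    // explain the "No model found" error from initLlama.
    throw new Error(`Model not found at ${bundledPath} or ${documentsPath}`);
  }

  return initLlama({
    model: modelPath,
    use_mlock: true,
    n_gpu_layers: Platform.OS === 'ios' ? 0 : 0, // > 0: enable GPU
  });
}
```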