morph-labs/rift

How do you try other models?

sytelus opened this issue · 2 comments

Are there any instructions on how one can switch to using other models for code completion? The model would have the same interface as replit-code-1.3b, so I am hoping the change should be easy.

Hi @sytelus, if there's a custom model for one of the currently supported backends (e.g. HF or GPT4All) that you'd like to try, you can edit the settings.json file directly (accessible through the VSCode settings tab) to set rift.chatModel or rift.codeEditModel to your custom model.

For example, we're about to add support for GGML models and the syntax for that in the settings.json file looks like this:

{
    "rift.codeEditModel": "llama:codellama-7b-instruct @ /Users/pv/Downloads/CodeLlama-7B-Instruct-GGUF/codellama-7b-instruct.Q5_K_M.gguf"
}
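(For one of the other supported backends such as HF, the setting would presumably point at the model identifier in the same `backend:model` pattern. The `hf:` prefix and model name below are illustrative assumptions, not confirmed syntax; check the rift docs for the exact format.)

```json
{
    "rift.codeEditModel": "hf:replit/replit-code-v1-3b"
}
```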

Hey, thanks! The model I wish to try out is not instruct-tuned and also not quantized (think of a replit-1.3B-like model). Is rift.codeEditModel the right setting? Are there any other settings to accommodate non-instruct-tuned models (for example, different prompts)? Also, are there any settings to accommodate non-FIM models?