mudler/LocalAI

Deprecation of old backends

mudler opened this issue · 0 comments

Is your feature request related to a problem? Please describe.
Several backends are effectively legacy by now, as llama.cpp has extended its support for different architectures via ggml over time.

These include, for instance, falcon-ggml and dolly.

This card is about removing support for old backends, not about dropping support for a model family (for instance, starcoder is supported by llama.cpp, so there is no need for a separate starcoder backend based on ggml). A sketch of the migration path is shown below.
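
As an illustration only (the exact backend identifier and field values depend on the LocalAI version and the model files available), a model that previously relied on the dedicated ggml-based starcoder backend could instead be declared against the llama.cpp backend:

```yaml
# Hypothetical LocalAI model definition: serving a starcoder model via the
# llama.cpp backend instead of the legacy ggml-based starcoder backend.
# The backend name and model filename below are illustrative, not exact.
name: starcoder
backend: llama-cpp                  # replaces the dedicated starcoder backend
parameters:
  model: starcoder.Q4_K_M.gguf      # GGUF conversion of the original weights
```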

Tracked in #1126