support native llama2
mcollina opened this issue · 0 comments
mcollina commented
https://github.com/withcatai/node-llama-cpp
(This module should be a devDependency and loaded/installed at runtime if selected)
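A minimal sketch of the runtime-loading idea described above: node-llama-cpp is not imported statically, but resolved with a dynamic `import()` only when the llama2 backend is selected, and a clear error is raised if it has not been installed. The function name `loadLlamaProvider` and the constructor calls are assumptions for illustration (they follow the node-llama-cpp v2-era API and may differ in the installed version).

```ts
// Hypothetical loader: keeps node-llama-cpp out of the regular dependency
// tree and only resolves it when the user picks the native llama2 backend.
export async function loadLlamaProvider(modelPath: string) {
  let llamaCpp: any;
  try {
    // Dynamic import so the module is resolved lazily at runtime.
    llamaCpp = await import("node-llama-cpp");
  } catch {
    throw new Error(
      "node-llama-cpp is not installed. Run `npm install node-llama-cpp` " +
        "to enable the native llama2 backend."
    );
  }

  // Assumed v2-style API; check the installed version's docs.
  const model = new llamaCpp.LlamaModel({ modelPath });
  const context = new llamaCpp.LlamaContext({ model });
  return new llamaCpp.LlamaChatSession({ context });
}
```

The `catch` branch is where an "install on demand" step could be wired in instead of throwing, if automatic installation is preferred over asking the user to install the package manually.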