Bugfix: Redo LocalLLM option
Closed this issue · 0 comments
rmusser01 commented
Title.
The Local LLM option currently doesn't work; it relies on non-functioning code.
Rewrite the flow to use shell callouts instead: launch the llamafile as an external process and ask the user to close it themselves when done.
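The proposed flow could be sketched roughly as below. This is a minimal illustration, not the repo's actual implementation: the binary path, the `--port` flag, and the function names are assumptions about how llamafile is invoked.

```python
import subprocess

def launch_llamafile(binary_path: str, port: int = 8080) -> subprocess.Popen:
    """Launch a llamafile server as a separate shell process.

    The flags here are assumed; check the actual llamafile CLI for
    the real option names.
    """
    cmd = [binary_path, "--port", str(port)]
    return subprocess.Popen(cmd)

def run_local_llm(binary_path: str) -> None:
    proc = launch_llamafile(binary_path)
    # Hand lifecycle control to the user instead of managing the
    # server in-process, as the issue suggests.
    input("llamafile launched -- press Enter here once you have closed it.")
    proc.wait()

if __name__ == "__main__":
    run_local_llm("./llamafile")  # hypothetical path to the llamafile binary
```

The key design point is that the application only spawns the process and then steps aside, rather than trying to manage the model server's lifecycle itself.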