Repo of the code from the Medium article "Build a powerful LLM API right on your computer".
This project has three parts:
- Create your first FastAPI app and interact with it (see the sketch below the article list)
- Create a Streamlit AI app that uses TinyLlama-1B-OpenOrca as an instruction-following AI you can reach on your local network
- Use the llama-cpp-python built-in API server and Streamlit to give your team a nice chatbot (coming soon)
The accompanying Medium articles:
- Create your LLM API: your ChatBOT as a service — part 1
- Create your LLM API: your ChatBOT as a service — part 2
- Create your LLM API: ChatBOT as a service — part 3
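For part 1, the idea is a small FastAPI app that loads the GGUF model with llama-cpp-python and exposes a single generation endpoint. The snippet below is only a minimal sketch: the model path, endpoint name, and prompt template are assumptions for illustration, not the repo's exact code.

```python
# Minimal FastAPI wrapper around a local GGUF model (illustrative sketch).
from fastapi import FastAPI
from pydantic import BaseModel
from llama_cpp import Llama

app = FastAPI(title="Local LLM API")

# Model path is an assumption; point it at your downloaded TinyLlama GGUF file.
llm = Llama(model_path="models/tinyllama-1.1b-1t-openorca.Q4_K_M.gguf", n_ctx=2048)

class Prompt(BaseModel):
    question: str
    max_tokens: int = 256

@app.post("/generate")
def generate(prompt: Prompt):
    # ChatML-style template used by the OpenOrca fine-tune (assumed here).
    text = f"<|im_start|>user\n{prompt.question}<|im_end|>\n<|im_start|>assistant\n"
    output = llm(text, max_tokens=prompt.max_tokens, stop=["<|im_end|>"])
    return {"answer": output["choices"][0]["text"].strip()}
```

Assuming the file is saved as `main.py`, it can be started with `uvicorn main:app --host 0.0.0.0 --port 8000` and queried from any machine on the local network.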
This is the Python file for the textual interface described in the part 3 article.
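In broad strokes, the textual interface is a chat loop that keeps the conversation history and posts it to the OpenAI-compatible endpoint exposed by the llama-cpp-python server (started with something like `python -m llama_cpp.server --model <your-model.gguf>`). The sketch below assumes the default local URL and a plain terminal loop; the actual file in this repo may differ in the details.

```python
# Rough sketch of a terminal chat client for the llama-cpp-python server.
# URL, port, and exit keywords are assumptions, not necessarily the repo's values.
import requests

API_URL = "http://localhost:8000/v1/chat/completions"  # default llama_cpp.server endpoint
history = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit"}:
        break
    history.append({"role": "user", "content": user_input})
    resp = requests.post(API_URL, json={"messages": history, "max_tokens": 350})
    answer = resp.json()["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    print(f"Assistant: {answer}\n")
```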
The result will look as shown below (llama-cpp-python server terminal on the left, chat interface on the right):