Ollama / Llama3

Work with local Ollama and Llama large language models, as well as other models supported by Ollama such as Mistral or Phi (https://ollama.com/library)
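For a first smoke test of a local model, a chat call can be as simple as the following minimal sketch (assuming the `ollama` Python package is installed and an Ollama server is running locally):

```python
# Minimal sketch: chat with a local model through the Ollama Python package.
# Assumes the Ollama server is running and the model has been pulled first,
# e.g. on the command line:  ollama pull llama3
import ollama

response = ollama.chat(
    model="llama3",  # swap in "mistral", "phi3", ... from https://ollama.com/library
    messages=[{"role": "user", "content": "In one sentence, what is a vector store?"}],
)
print(response["message"]["content"])
```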


Ollama and Llama3 — A Streamlit App to convert your files into local Vector Stores and chat with them using the latest LLMs

https://medium.com/p/c5340fcd6ad0

A Streamlit app to convert your files into vector stores and chat with them using LLMs

https://github.com/ml-score/ollama/tree/main/script

(Image: overview of the Ollama Streamlit app)
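The general pattern behind such an app looks like this (a hedged sketch, not the repository's exact code; it assumes the `langchain-community` and `faiss-cpu` packages, which may differ from what the script actually uses):

```python
# Sketch: embed text chunks with a local Ollama model and store them in FAISS.
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_text_splitters import RecursiveCharacterTextSplitter

texts = ["First document ...", "Second document ..."]  # placeholder file contents

# Split the raw text into overlapping chunks suitable for embedding
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.create_documents(texts)

# Build and persist a local vector store
store = FAISS.from_documents(chunks, OllamaEmbeddings(model="llama3"))
store.save_local("my_vector_store")

# Retrieve the chunks most relevant to a question
for doc in store.similarity_search("What do the files say about logs?", k=3):
    print(doc.page_content)
```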


Llama3 and KNIME - Build your local Vector Store from PDFs and other Documents

Runs on KNIME 4 and Python

https://medium.com/p/237eda761c1c

Chat with local Llama 3 Model via Ollama in KNIME Analytics Platform

Also extracts logs into structured JSON files

https://medium.com/p/aca61e4a690a
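The log-extraction idea can be sketched like this (an illustration under assumptions, not the article's exact code; the log line and field names are hypothetical): ask the local model for structured output via Ollama's JSON mode.

```python
# Sketch: turn an unstructured log line into structured JSON with local Llama 3.
# Assumes the `ollama` Python package; log line and fields are made up.
import json
import ollama

log_line = "2024-05-01 12:03:11 ERROR db-service timeout after 30s on host prod-7"

response = ollama.chat(
    model="llama3",
    format="json",  # ask Ollama to constrain the reply to valid JSON
    messages=[{
        "role": "user",
        "content": "Extract timestamp, level, service and message from this "
                   f"log line and return them as JSON: {log_line}",
    }],
)
print(json.loads(response["message"]["content"]))
```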

Creating a Local LLM Vector Store from PDFs with KNIME and GPT4All

https://medium.com/p/311bf61dd20e

KNIME, AI Extension and local Large Language Models (LLM)

https://medium.com/p/cef650fc142b


In the subfolder /notebooks/ you will find sample code to work with local large language models and your own files (a sketch of the general pattern follows the list):

  • Ollama - Chat with your Logs.ipynb
  • Ollama - Chat with your PDF.ipynb
  • Ollama - Chat with your Unstructured CSVs.ipynb
  • Ollama - Chat with your Unstructured Log Files.ipynb
  • Ollama - Chat with your Unstructured Text Files.ipynb
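For example, the "Chat with your PDF" notebook follows roughly this pattern (a sketch under assumptions, not the notebook verbatim; it presumes `langchain-community`, `pypdf` and `faiss-cpu` are installed):

```python
# Sketch: answer questions about a PDF with a local Ollama model.
from langchain_community.chat_models import ChatOllama
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS

# Load and chunk the PDF, then index it locally (file name is a placeholder)
pages = PyPDFLoader("my_document.pdf").load_and_split()
store = FAISS.from_documents(pages, OllamaEmbeddings(model="llama3"))

# Retrieve the most relevant chunks and let the model answer from them
question = "What is the main topic of the document?"
context = "\n\n".join(d.page_content for d in store.similarity_search(question, k=3))

answer = ChatOllama(model="llama3").invoke(
    f"Answer using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```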

You can find an example of how to use these notebooks within KNIME to chat with or process your files in this KNIME workflow (it is probably best to download the whole workflow group):

https://hub.knime.com/-/spaces/-/~5s39Yth4NbkUIj0q/current-state/


More articles that might be interesting: