A simple repo for creating and managing ollama models for various applications.
To get started with ollama-models, clone this repository and follow the installation steps below.
- Install Ollama: https://ollama.com/download
- Clone this repository to your local machine:
  `git clone https://gitlab.internal.equipmentshare.com/don.schartman/ollama-models`
- Navigate to the project directory:
  `cd ollama-models`
- Create and manage custom models for various applications
- Customize models with parameters and prompts
- Run and test models locally
Model files (e.g., `notes.modelfile`, `explorer.modelfile`) define the behavior of Ollama models. They contain the instructions and parameters that determine how a model generates output.
To build a model from a model file, use the `ollama create` command with a model name and the `-f` flag pointing at the file (e.g., `ollama create notes -f ./notes.modelfile`). Edit the file's contents to define the model's behavior, then re-run `ollama create` to apply your changes.
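As a sketch, a minimal model file might look like the following. The base model and prompt here are illustrative assumptions, not the contents of this repository's actual model files:

```
# Illustrative model file -- base model and prompt are assumptions,
# not this repository's actual notes.modelfile.
FROM llama3.1:70b

# Default persona for every conversation with this model
SYSTEM You are a concise note-taking assistant.
```

Building it with `ollama create notes -f ./notes.modelfile` registers a local model named `notes` that you can run with `ollama run notes`.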
Model parameters customize the behavior of Ollama models. Some common parameters include:

- `temperature`: controls the level of creativity (randomness) in the generated output.
- `num_ctx`: sets the size of the context window, in tokens.
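In a model file, parameters are set with `PARAMETER` lines. The values below are illustrative defaults, not tuned recommendations:

```
# Lower temperature -> more deterministic output; higher -> more creative
PARAMETER temperature 0.7
# Size of the context window, in tokens
PARAMETER num_ctx 4096
```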
System messages (set with the `SYSTEM` instruction in a model file) define the model's default behavior and persona, letting you inject custom behavior and tailor a base model to your needs.
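For instance, a `SYSTEM` instruction might look like this (the prompt text is a hypothetical example, not taken from this repository's model files):

```
SYSTEM You are a file-system explorer assistant. Answer questions about directory layouts briefly and accurately.
```

The system message is applied to every conversation with the model, so it is a good place for instructions that should always hold.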
- Create a new model: `ollama create <model_name> -f ./modelfile`
- Pull an existing model: `ollama pull <model_name>`
- Remove a model: `ollama rm <model_name>`
- Run a model: `ollama run <model_name>`
For example, to pull the base model and build the models defined in this repository:

```shell
ollama pull llama3.1:70b
ollama create notes -f ./notes.modelfile
ollama create explorer -f ./explorer.modelfile
ollama create refine -f ./refine.modelfile
ollama create main -f ./main.modelfile
ollama create planner -f ./planner.modelfile
```
- Create a custom language model for text generation
- Develop a personalized chatbot with a specific personality
- Run a vision-capable model for image understanding and adapt it to your own use case