ollama-nix

CUDA-enabled ollama nix flake


As of March 2024, ollama in nixpkgs handles the CUDA mess on its own.
A sensible NixOS module is available too! There is no need to use this
flake anymore.
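For reference, a minimal sketch of what switching to the nixpkgs module can look like. The `services.ollama` options shown here (in particular `acceleration`) are assumed from recent NixOS releases and may differ on older channels:

```nix
# Minimal sketch: enable ollama from nixpkgs with CUDA acceleration.
# Option names are assumptions based on recent NixOS releases.
{ config, pkgs, ... }:
{
  services.ollama = {
    enable = true;
    acceleration = "cuda"; # assumed option; omit to fall back to CPU
  };
}
```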

Original README content:
As of 12 Nov 2023, ollama in nixpkgs fails to utilize CUDA-enabled devices.

This repo contains a nix flake that defines:
- ollama package with CUDA support
- NixOS module that may be helpful if you plan on using ollama as a system-wide
  LLM runner (see the sketch after this list)
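As a rough illustration, a flake-based NixOS configuration could pull in this flake roughly as follows. The exact output names (`nixosModules.default` in particular) are assumptions for the sake of the example and may not match what the repo actually exports:

```nix
# Hypothetical flake.nix consuming ollama-nix; output attribute names
# are assumed and may differ from the repo's real flake outputs.
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    ollama-nix.url = "github:havaker/ollama-nix";
  };

  outputs = { self, nixpkgs, ollama-nix, ... }: {
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        ./configuration.nix
        ollama-nix.nixosModules.default # assumed module output name
      ];
    };
  };
}
```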

To run the CUDA-enabled ollama binary directly, make sure nvidia-smi is
available in your PATH and run the following command:
nix run github:havaker/ollama-nix