# CameLLM

Run your favourite LLMs locally on macOS from Swift


CameLLM is a collection of Swift packages for running LLMs (including the LLaMA family and GPT-J models) locally on macOS (and hopefully, in the future, iOS) with modern, clean Swift bindings 🚀

## 🐪 This repository

CameLLM is plugin-based. This is the main CameLLM repo, which exposes the shared and abstract types that are implemented by plugins and used by clients of the CameLLM family of packages.

## ⚠️ Status

This project is still under development, but is being built to power version 2 of LlamaChat.

## Packages

The core of CameLLM is implemented through several main packages:

- `CameLLM/CameLLM` (this repo): Provides shared and abstract types used by implementing plugins and by clients of CameLLM.
- `CameLLM/CameLLM-Common`: Provides shared classes, in both Swift and Objective-C++, for use by implementing plugins.
- `CameLLM/CameLLM-Plugin-Harness`: Provides (mostly) abstract types for plugins to implement, as well as some shared classes they can utilise.

Current CameLLM plugins:

## 🏛️ Architecture

All of the CameLLM packages are available through the Swift Package Manager.
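As a minimal sketch, adding CameLLM to a client `Package.swift` might look like the following (the repository URL, version requirement, and product name are assumptions; check the individual package repos for their actual coordinates):

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyLLMApp",
    platforms: [.macOS(.v13)],
    dependencies: [
        // Hypothetical coordinates for the main CameLLM package.
        .package(url: "https://github.com/CameLLM/CameLLM", branch: "main")
    ],
    targets: [
        .executableTarget(
            name: "MyLLMApp",
            dependencies: [
                .product(name: "CameLLM", package: "CameLLM")
            ]
        )
    ]
)
```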

Since most work to date on running LLMs locally on macOS and iOS has been done in llama.cpp and ggml, which are written in C and C++, most of the CameLLM packages ship both a Swift and an Objective-C++ library, named `<PackageName>` and `<PackageName>ObjCxx` respectively.
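To illustrate the naming convention (the package and target names here are hypothetical, not taken from an actual plugin), a plugin manifest typically layers a Swift target over an Objective-C++ target that wraps the C/C++ core:

```swift
// swift-tools-version:5.7
// Hypothetical plugin manifest following the <PackageName> / <PackageName>ObjCxx convention.
import PackageDescription

let package = Package(
    name: "CameLLMExample",
    platforms: [.macOS(.v13)],
    products: [
        .library(name: "CameLLMExample", targets: ["CameLLMExample"])
    ],
    targets: [
        // Swift API surface exposed to clients.
        .target(
            name: "CameLLMExample",
            dependencies: ["CameLLMExampleObjCxx"]
        ),
        // Objective-C++ layer bridging to the underlying C/C++ implementation.
        .target(
            name: "CameLLMExampleObjCxx",
            cxxSettings: [.unsafeFlags(["-std=c++17"])]
        )
    ]
)
```

Keeping the Objective-C++ code in its own target means only the bridging layer needs to compile as Objective-C++, while clients interact purely with the Swift API.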