
This codebase is a React- and Electron-based app that runs the FreedomGPT LLM locally (offline and private) on Mac and Windows through a chat interface (based on Alpaca-LoRA).

Primary language: TypeScript. License: GNU General Public License v3.0 (GPL-3.0).

Freedom GPT


Join our Discord Community

Join our Discord Server to get the latest updates and to interact with the community.


Introduction

This is the repository for the Freedom GPT application, a desktop app built with Electron and React that allows users to run Alpaca models on their local machine.

Prerequisites

If you want to run the project

git clone --recursive https://github.com/ohmplatform/FreedomGPT.git freedom-gpt
cd freedom-gpt
yarn install
yarn start:prod

If you want to contribute to the project

Working with the repository

git clone --recursive https://github.com/ohmplatform/FreedomGPT.git freedom-gpt
cd freedom-gpt
yarn install

Building the llama.cpp library

Building from Source (macOS/Linux)

cd llama.cpp
make

Building from Source (Windows)

cd llama.cpp
cmake .
cmake --build . --config Release
  • You should now have a Release folder with a main.exe file inside it. You can run this file to test the chat client.
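As a quick smoke test of the build, the chat client can be launched against a downloaded model. This is a hedged sketch: the -m (model path), -p (prompt), and -n (token count) flags follow llama.cpp's standard CLI, and the model filename below is illustrative, not something shipped with this repository.

```shell
# Hedged usage sketch: launch the freshly built chat client against a
# local model. The -m/-p/-n flags follow llama.cpp's standard CLI; the
# model filename is illustrative only.
BIN=Release/main.exe
if [ -x "$BIN" ]; then
  "$BIN" -m models/ggml-model-q4_0.bin -p "Hello" -n 64
else
  echo "build main.exe first (see the cmake steps above)"
fi
```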

Changing the API URL

The API URL defaults to http://localhost:8889; you can change it in the file src/index.ts.
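As a minimal sketch of what that change amounts to (the constant name API_URL is an assumption for illustration; only the value http://localhost:8889 comes from this README):

```typescript
// Hedged sketch of the API base URL as it might appear in src/index.ts.
// Only the value http://localhost:8889 comes from the README; the
// constant name API_URL is illustrative.
const API_URL = "http://localhost:8889";

// Changing the port here must be mirrored wherever the server listens,
// e.g. the docker run -p mapping shown in the Docker section.
console.log(new URL(API_URL).port); // prints "8889"
```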

Running the application

To run the application, run the following command in your terminal:

yarn start

Note: Make sure you are in the root directory of the project.

Dockerizing the application

To pull and run the prebuilt Docker image, run the following commands in your terminal:

docker pull freedomgpt/freedomgpt
docker run -d -p 8889:8889 freedomgpt/freedomgpt

If you want to build the Docker image yourself, run the following command in your terminal:

docker build -t freedomgpt/freedomgpt .

OR

yarn docker

Working Video

Screen.Recording.2023-04-22.at.8.41.01.AM.mov

Credits

This project utilizes several open-source packages and libraries, without which this project would not have been possible:

"llama.cpp" by Georgi Gerganov - a C/C++ library for running inference with LLaMA-family models. https://github.com/ggerganov/llama.cpp

"LLaMA" by Facebook Research - a collection of foundation large language models. https://github.com/facebookresearch/llama

"Alpaca" by Stanford CRFM - an instruction-following language model fine-tuned from LLaMA. https://crfm.stanford.edu/2023/03/13/alpaca.html

"alpaca-lora" by tloen - code for instruction-tuning LLaMA using low-rank adaptation (LoRA). https://github.com/tloen/alpaca-lora

We would like to express our gratitude to the developers of these packages and their contributors for making their work available to the public under open source licenses. Their contributions have enabled us to build a more robust and efficient project.

LICENSE

See the LICENSE file.