Uses OgbujiPT (a language AI toolkit) to help a user manage their bookmarks in the context of chat and other interactions.
Prerequisites:
- Account on Raindrop
- Integration app you've set up on the Raindrop site, and its API token
- Discord app and token
- PGVector instance running
- llama.cpp server running an LLM model
- Python 3.10 or newer
Go to the Raindrop integrations page and select or create an app.
Use "Create test token" and copy the token provided, which will only give you access to bookmarks within your own account. Save this token into your environment or password manager. You can give living_bookmarks the token on the command line (`--raindrop-key`) or in the environment (`LIVING_BOOKMARKS_RAINDROP_KEY`).
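If you want to sanity-check the token, a quick curl against the stock Raindrop REST API (nothing specific to this demo) should return your account details. This assumes you've already exported the token as `LIVING_BOOKMARKS_RAINDROP_KEY`:

```sh
# Expect a JSON blob with your user info; an error means the token isn't valid
curl -s -H "Authorization: Bearer $LIVING_BOOKMARKS_RAINDROP_KEY" \
  https://api.raindrop.io/rest/v1/user
```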
You need a Discord app (bot) set up; the discord.py docs have a useful primer on this. Save the bot token to your password manager.
You can specify the token on the command line (`--discord-token`) or in the environment (`LIVING_BOOKMARKS_DISCORD_TOKEN`).
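For the rest of these steps it's handy to have both tokens in the environment. For example, in a bash-style shell (the values below are placeholders):

```sh
# Placeholders; substitute your real tokens
export LIVING_BOOKMARKS_RAINDROP_KEY="your-raindrop-test-token"
export LIVING_BOOKMARKS_DISCORD_TOKEN="your-discord-bot-token"
```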
Setting up the PGVector instance is easiest with Docker. You can just do:
```sh
# Replace the following with your preferred shell's way of updating the environment
export DB_HOST="localhost"
export DB_PORT="5432"
export DB_USER="me"
export DB_PASSWORD="my_secret_secret"
export DB_NAME="my_embeddings"

docker pull ankane/pgvector
docker run --name mydb -p 5432:5432 \
    -e POSTGRES_USER=$DB_USER -e POSTGRES_PASSWORD=$DB_PASSWORD -e POSTGRES_DB=$DB_NAME \
    -d ankane/pgvector
```
Make sure you don't have anything running on port 5432, or update the port in the environment and in the command.
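To confirm the container is up and the pgvector extension is usable, you can run a one-off psql command inside it (container name and credentials as set above):

```sh
# Enables the pgvector extension; an error here means the database isn't ready
docker exec -it mydb psql -U $DB_USER -d $DB_NAME \
  -c "CREATE EXTENSION IF NOT EXISTS vector;"
```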
For now you probably have to download, build, and run the llama.cpp server yourself.
TODO: Hosted Docker image for llama.cpp running Phi-2-super GGUF
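A rough sketch of doing that by hand, assuming a typical Linux/macOS build setup. The model path, port, and even the server binary name vary by llama.cpp version and by which Phi-2-super GGUF quantization you download, so treat these as placeholders and check the llama.cpp README for current instructions:

```sh
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make server  # some versions build the server with plain `make` or via CMake

# Placeholder model path; point this at the GGUF file you downloaded
./server -m /path/to/phi-2-super.Q5_K_M.gguf -c 2048 --host 127.0.0.1 --port 8080
```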
Install the Python dependencies:

```sh
pip install -Ur requirements.txt --constraint=constraints.txt
```
Copy `config.default.toml` to `config.toml`, then tweak it to suit your taste and setup.
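For example:

```sh
cp config.default.toml config.toml
# Open it in your editor of choice and adjust the settings
$EDITOR config.toml
```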
Run the program. Note that the following assumes you have `LIVING_BOOKMARKS_RAINDROP_KEY` and `LIVING_BOOKMARKS_DISCORD_TOKEN` in the environment, so they're not specified on the command line.
```sh
python launch.py --config config.toml
```
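If you'd rather not keep the secrets in the environment, passing them via the flags mentioned above should be equivalent (values are placeholders):

```sh
python launch.py --config config.toml \
  --raindrop-key "your-raindrop-test-token" \
  --discord-token "your-discord-bot-token"
```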