Connect realtime data to your AI
Pacha is an AI tool that retrieves context for natural language queries using a SQL interface. Pacha is purpose-built to work with Hasura DDN for authorized multi-source querying.
Use Pacha with your favourite LLMs to generate grounded responses in your AI apps, agents, and chatbots.
- Python 3.12 or newer
- Access to an OpenAI or Anthropic API key.
- A Postgres database you want to try Pacha out on.
- Install Poetry
- Run `poetry install` to install Python dependencies.
Note: You can skip the Hasura DDN setup below if you're running Pacha directly against Postgres.
- Create a Hasura account at hasura.io/ddn
- Scaffold a local Hasura setup on a Postgres database:

  ```bash
  poetry run ddn_setup -c <postgres connection string> --dir ddn_project
  ```
- The generated metadata is where you configure row- and column-level access control rules for your data.
- Start a local Hasura engine:

  ```bash
  docker compose -f ddn_project/docker-compose.hasura.yaml up -d
  ```
`examples/chat_with_tool.py` is a CLI chat interface that uses Pacha with Anthropic:

```bash
ANTHROPIC_API_KEY=<api-key> poetry run chat_with_anthropic -d ddn -u <DDN SQL URL> -H <header to pass to DDN>
```
Example:

```bash
ANTHROPIC_API_KEY=<api-key> poetry run chat_with_anthropic -d ddn -u http://localhost:3000/v1/sql -H 'x-hasura-role: admin'
```
You can also run Pacha with OpenAI:

```bash
OPENAI_API_KEY=<api-key> poetry run chat_with_openai -d ddn -u <DDN SQL URL> -H <header to pass to DDN>
```
If you want to run against a custom SQL backend that's not Hasura DDN or Postgres, you can implement the `DataEngine` class in `pacha/data_engine` and pass it to the Pacha SDK. See usage here. An example Postgres implementation of `DataEngine` is in `pacha/data_engine/postgres.py`.
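As a rough self-contained sketch of what a custom engine needs to do (the class and method names here are illustrative assumptions; check `pacha/data_engine` for the real interface and signatures), it mainly has to execute a SQL string against your backend and return the rows. This toy version uses an in-memory SQLite database in place of a real backend:

```python
import sqlite3


class SqliteDataEngine:
    """Illustrative stand-in for a custom data engine.

    The real DataEngine interface lives in pacha/data_engine; this sketch
    only shows the core responsibility: run SQL, return rows.
    """

    def __init__(self, path: str = ":memory:"):
        self.connection = sqlite3.connect(path)

    def execute_sql(self, sql: str) -> list[tuple]:
        # Run the query against the backend and return all result rows.
        return self.connection.execute(sql).fetchall()


engine = SqliteDataEngine()
engine.execute_sql("CREATE TABLE users (id INTEGER, name TEXT)")
engine.execute_sql("INSERT INTO users VALUES (1, 'Ada'), (2, 'Grace')")
rows = engine.execute_sql("SELECT name FROM users ORDER BY id")
print(rows)  # [('Ada',), ('Grace',)]
```

A real implementation would additionally handle connection pooling, errors, and whatever access control your backend enforces, as the Postgres example in the repo does.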
You can run Pacha against any LLM that supports function/tool calling. See the examples in `examples/chat_with_tool_anthropic.py` and `examples/chat_with_tool_openai.py`.
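The Pacha SDK and the examples above handle this wiring for you, but the underlying tool-calling pattern is the same with any such LLM: you declare a tool (name, description, JSON schema for its arguments), the model responds with a tool call, and your app executes it and sends the result back. A minimal self-contained sketch of that round trip (the `run_sql` tool definition here is illustrative, not Pacha's actual tool schema, and the model's side is hard-coded):

```python
import json
import sqlite3

# Toy database standing in for your data source.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

# Tool definition in the shape most tool-calling APIs expect:
# a name, a description, and a JSON schema for the arguments.
run_sql_tool = {
    "name": "run_sql",
    "description": "Run a read-only SQL query and return the rows as JSON.",
    "input_schema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}


def handle_tool_call(name: str, arguments: dict) -> str:
    # Dispatch the model's tool call to the matching executor and
    # serialize the result so it can be sent back to the model.
    if name == "run_sql":
        rows = db.execute(arguments["sql"]).fetchall()
        return json.dumps(rows)
    raise ValueError(f"unknown tool: {name}")


# In a real chat loop, `name` and `arguments` come from the model's
# response; here one call is hard-coded to show the round trip.
result = handle_tool_call("run_sql", {"sql": "SELECT SUM(total) FROM orders"})
print(result)  # [[29.5]]
```

Grounding works because the model's final answer is generated from the tool result rather than from its parametric memory alone.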