Backend implementation for building workflow with natural language
- Github repository: https://github.com/ai-zerolab/floword/
- (WIP) Documentation: https://ai-zerolab.github.io/floword/
We recommend using uv to manage your environment.
```shell
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh  # For macOS/Linux
# or
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"  # For Windows
```
Then you can run the floword server with `uvx floword@latest start`.
Docker is also supported. You can pull the image from the GitHub Container Registry with `docker pull ghcr.io/ai-zerolab/floword:latest`.
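Once pulled, the container can be started with configuration passed in via an env file. This is a sketch, not a documented invocation; in particular, the container port below is an assumption, so adjust it to your deployment:

```shell
# Run floword in the background, reading configuration from a local .env file.
# The 8000:8000 port mapping is an assumption -- check your configuration.
docker run -d --name floword --env-file .env -p 8000:8000 \
  ghcr.io/ai-zerolab/floword:latest
```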
(WIP) You can find deployment instructions in the `deploy` directory.
You can use a `.env` file or environment variables to configure floword. All environment variables should be prefixed with `FLOWORD_` (case-insensitive).
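As an illustration of how this prefix-based, case-insensitive lookup works (a hypothetical sketch of the mechanism, not floword's actual settings loader):

```python
def load_floword_settings(environ: dict[str, str], prefix: str = "FLOWORD_") -> dict[str, str]:
    """Collect settings from variables carrying the given prefix,
    matched case-insensitively, with the prefix stripped from the keys."""
    settings = {}
    for key, value in environ.items():
        if key.upper().startswith(prefix.upper()):
            settings[key.upper().removeprefix(prefix.upper()).lower()] = value
    return settings

# Both spellings are picked up because matching is case-insensitive;
# unrelated variables such as PATH are ignored.
env = {"FLOWORD_PG_HOST": "localhost", "floword_allow_anonymous": "true", "PATH": "/usr/bin"}
print(load_floword_settings(env))
# {'pg_host': 'localhost', 'allow_anonymous': 'true'}
```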
Available options:
- `FLOWORD_JWT_SECRET_TOKEN`: Secret token for JWT authentication. Default: `None`
- `FLOWORD_ALLOW_ANONYMOUS`: Allow anonymous access. Default: `True`
- `FLOWORD_SQLITE_FILE_PATH`: Path to the SQLite database file. Default: `./floword.sqlite` (in the current working directory)
- `FLOWORD_USE_POSTGRES`: Use PostgreSQL instead of SQLite. Default: `False`
- `FLOWORD_PG_USER`: PostgreSQL username. Default: `postgres`
- `FLOWORD_PG_PASSWORD`: PostgreSQL password. Default: `postgres`
- `FLOWORD_PG_HOST`: PostgreSQL host. Default: `localhost`
- `FLOWORD_PG_PORT`: PostgreSQL port. Default: `5432`
- `FLOWORD_PG_DATABASE`: PostgreSQL database name. Default: `floword`
- `FLOWORD_REDIS_URL`: Redis URL for streaming messages in distributed mode. Default: `None`
- `FLOWORD_DEFAULT_MODEL_PROVIDER`: Default LLM provider. Default: `openai`
- `FLOWORD_DEFAULT_MODEL_NAME`: Default model name. Default: `None`
- `FLOWORD_DEFAULT_MODEL_KWARGS`: Additional arguments for the model (as a JSON string). Default: `None`
- `FLOWORD_DEFAULT_CONVERSATION_SYSTEM_PROMPT`: Default system prompt for conversations. Default: content from `floword/prompts/system-conversation.md`
- `FLOWORD_DEFAULT_WORKFLOW_SYSTEM_PROMPT`: Default system prompt for workflows. Default: content from `floword/prompts/system-workflow.md`
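Putting a few of these together, a minimal `.env` might look like the following. The values are placeholders for illustration, not recommendations:

```env
FLOWORD_JWT_SECRET_TOKEN=change-me
FLOWORD_ALLOW_ANONYMOUS=false
FLOWORD_USE_POSTGRES=true
FLOWORD_PG_HOST=localhost
FLOWORD_PG_PASSWORD=postgres
FLOWORD_DEFAULT_MODEL_PROVIDER=openai
FLOWORD_DEFAULT_MODEL_NAME=gpt-4o
```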
Use `FLOWORD_MCP_CONFIG_PATH` to specify the path to the MCP configuration file. Default: `./mcp.json` (in the current working directory)
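For example, to point floword at a configuration file in another location (the path below is hypothetical):

```shell
export FLOWORD_MCP_CONFIG_PATH=/etc/floword/mcp.json
uvx floword@latest start
```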
The MCP configuration file should be a JSON file with the following structure:
```json
{
  "mcpServers": {
    "zerolab-toolbox": {
      "args": ["mcp-toolbox@latest", "stdio"],
      "command": "uvx",
      "env": {
        "FIGMA_API_KEY": "your-figma-api-key"
      }
    },
    "sse-server": {
      "url": "http://localhost:8000",
      "headers": {},
      "timeout": 5,
      "sse_read_timeout": 300
    }
  }
}
```

Fork the repository and clone it to your local machine.
```shell
# Install in development mode
make install

# Activate the virtual environment
source .venv/bin/activate  # For macOS/Linux
# or
.venv\Scripts\activate  # For Windows
```

Run `make test` for tests, `make check` for code checks, and `make docs` to build the documentation.

Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the terms of the license included in the repository.