
⛓️ AIForge

Discover a simpler & smarter way to train and deploy any AI Model

📦 Installation

Locally

You can install AIForge with pip:

# This installs the package without dependencies for local models
pip install aiforge
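
If you prefer an isolated install, a minimal sketch (assuming a standard Python 3 setup; the environment name .venv is just an example) is:

python -m venv .venv
source .venv/bin/activate   # On Windows: .venv\Scripts\activate
pip install aiforge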

You can still use models served by projects like LocalAI, Ollama, LM Studio, Jan, and others.

Next, run:

python -m aiforge

or

aiforge run # or aiforge --help

🖥️ Command Line Interface (CLI)

AIForge provides a command-line interface (CLI) for easy management and configuration.

Usage

You can run AIForge using the following command:

aiforge run [OPTIONS]

Each option is detailed below:

  • --help: Displays all available options.
  • --host: Defines the host to bind the server to. Can be set using the LANGFLOW_HOST environment variable. The default is 127.0.0.1.
  • --workers: Sets the number of worker processes. Can be set using the LANGFLOW_WORKERS environment variable. The default is 1.
  • --timeout: Sets the worker timeout in seconds. The default is 60.
  • --port: Sets the port to listen on. Can be set using the LANGFLOW_PORT environment variable. The default is 7860.
  • --config: Defines the path to the configuration file. The default is config.yaml.
  • --env-file: Specifies the path to the .env file containing environment variables. The default is .env.
  • --log-level: Defines the logging level. Can be set using the LANGFLOW_LOG_LEVEL environment variable. The default is critical.
  • --components-path: Specifies the path to the directory containing custom components. Can be set using the LANGFLOW_COMPONENTS_PATH environment variable. The default is langflow/components.
  • --log-file: Specifies the path to the log file. Can be set using the LANGFLOW_LOG_FILE environment variable. The default is logs/langflow.log.
  • --cache: Selects the type of cache to use. Options are InMemoryCache and SQLiteCache. Can be set using the LANGFLOW_LANGCHAIN_CACHE environment variable. The default is SQLiteCache.
  • --dev/--no-dev: Toggles the development mode. The default is no-dev.
  • --path: Specifies the path to the frontend directory containing build files. This option is for development purposes only. Can be set using the LANGFLOW_FRONTEND_PATH environment variable.
  • --open-browser/--no-open-browser: Toggles the option to open the browser after starting the server. Can be set using the LANGFLOW_OPEN_BROWSER environment variable. The default is open-browser.
  • --remove-api-keys/--no-remove-api-keys: Toggles the option to remove API keys from the projects saved in the database. Can be set using the LANGFLOW_REMOVE_API_KEYS environment variable. The default is no-remove-api-keys.
  • --install-completion [bash|zsh|fish|powershell|pwsh]: Installs completion for the specified shell.
  • --show-completion [bash|zsh|fish|powershell|pwsh]: Shows completion for the specified shell, allowing you to copy it or customize the installation.
  • --backend-only: Runs only the backend server, without the frontend. Can be set using the LANGFLOW_BACKEND_ONLY environment variable. The default is False.
  • --store/--no-store: Enables or disables the store features. Can be set using the LANGFLOW_STORE environment variable. The default is store.
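
For example, to bind the server to all network interfaces, use two worker processes, and enable more verbose logging, you could run something like the following (the flags come from the list above; the host, port, worker count, and log level are placeholder values):

aiforge run --host 0.0.0.0 --port 7860 --workers 2 --log-level info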

These parameters are important for users who need to customize the behavior of AIForge, especially in development or specialized deployment scenarios.

Environment Variables

You can configure many of the CLI options using environment variables. These can be exported in your operating system or added to a .env file and loaded using the --env-file option.

A sample .env file named .env.example is included with the project. Copy it to a new file named .env and replace the example values with your own settings. If a variable is set both in your OS environment and in the .env file, the value from .env takes precedence.
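
As a rough sketch, a .env file could set some of the variables mentioned above (the values below are placeholders, not recommended settings):

# Example values only
LANGFLOW_HOST=0.0.0.0
LANGFLOW_PORT=7860
LANGFLOW_WORKERS=2
LANGFLOW_LOG_LEVEL=info

You can then start the server with aiforge run --env-file .env.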

🎨 Creating Flows

Creating flows with AIForge is easy. Simply drag components from the sidebar onto the canvas and connect them to start building your application.

Explore by editing prompt parameters, grouping components into a single high-level component, and building your own Custom Components.

Once you’re done, you can export your flow as a JSON file.

Load the flow with:

from aiforge import load_flow_from_json

flow = load_flow_from_json("path/to/flow.json")
# Now you can use it
flow("Hey, have you heard of AIForge?")

📄 License

AIForge is released under the MIT License. See the LICENSE file for details.