ClientAI

A unified client for AI providers with built-in agent support.

ClientAI is a Python package that provides a unified framework for building AI applications, from direct provider interactions to transparent LLM-powered agents, with seamless support for OpenAI, Replicate, Groq, and Ollama.

Documentation: igorbenav.github.io/clientai/


Features

  • Unified Interface: Consistent methods across multiple AI providers (OpenAI, Replicate, Groq, Ollama).
  • Streaming Support: Real-time response streaming and chat capabilities.
  • Intelligent Agents: Framework for building transparent, multi-step LLM workflows with tool integration.
  • Modular Design: Use components independently, from simple provider wrappers to complete agent systems.
  • Type Safety: Comprehensive type hints for better development experience.

Installing

To install ClientAI with all providers, run:

pip install "clientai[all]"

Or, if you prefer to install only specific providers:

pip install "clientai[openai]"  # For OpenAI support
pip install "clientai[replicate]"  # For Replicate support
pip install "clientai[ollama]"  # For Ollama support
pip install "clientai[groq]"  # For Groq support

Quick Start Examples

Basic Provider Usage

from clientai import ClientAI

# Initialize with OpenAI
client = ClientAI('openai', api_key="your-openai-key")

# Generate text
response = client.generate_text(
    "Tell me a joke",
    model="gpt-3.5-turbo",
)
print(response)

# Chat functionality
messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "What is its population?"}
]

response = client.chat(
    messages,
    model="gpt-3.5-turbo",
)
print(response)
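The streaming support listed under Features follows the same consumption pattern. Below is a minimal sketch of reading a streamed response chunk by chunk; the stub generator stands in for the provider call, and the `stream=True` flag shown in the comment is an assumption about the API, not a confirmed signature:

```python
# Stub standing in for a streamed provider call such as
# client.generate_text("Tell me a joke", model="gpt-3.5-turbo", stream=True),
# which is assumed to yield text fragments as they arrive.
def fake_stream():
    yield from ["Why did ", "the chicken ", "cross the road?"]

parts = []
for chunk in fake_stream():
    parts.append(chunk)  # display each fragment as soon as it arrives

print("".join(parts))
```

The same loop works for chat streaming: iterate, render each fragment immediately, and join the fragments if you also need the full text afterwards.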

Quick-Start Agent

from clientai import ClientAI
from clientai.agent import create_agent, tool

@tool(name="calculator")
def calculate_average(numbers: list[float]) -> float:
    """Calculate the arithmetic mean of a list of numbers."""
    return sum(numbers) / len(numbers)

analyzer = create_agent(
    client=ClientAI("groq", api_key="your-groq-key"),
    role="analyzer",
    system_prompt="You are a helpful data analysis assistant.",
    model="llama-3.2-3b-preview",
    tools=[calculate_average]
)

result = analyzer.run("Calculate the average of these numbers: [1000, 1200, 950, 1100]")
print(result)
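Because tools are plain Python functions, their logic can be exercised directly, without an agent or API key. The snippet below repeats the helper from the example (shown undecorated so it runs standalone) and calls it the way the agent would:

```python
# The agent's tool is an ordinary function; calling it directly is a
# convenient way to unit-test tool logic without any provider calls.
def calculate_average(numbers: list[float]) -> float:
    """Calculate the arithmetic mean of a list of numbers."""
    return sum(numbers) / len(numbers)

print(calculate_average([1000, 1200, 950, 1100]))  # 1062.5
```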

See our documentation for more examples, including:

  • Custom workflow agents with multiple steps
  • Complex tool integration and selection
  • Advanced usage patterns and best practices

Design Philosophy

The ClientAI Agent module is built on three core principles:

  1. Prompt-Centric Design: Prompts are explicit, debuggable, and transparent. What you see is what is sent to the model.

  2. Customization First: Every component is designed to be extended or overridden. Create custom steps, tool selectors, or entirely new workflow patterns.

  3. Zero Lock-In: Start with high-level components and drop down to lower levels as needed. You can:

    • Extend Agent for custom behavior
    • Use individual components directly
    • Gradually replace parts with your own implementation
    • Or migrate away entirely - no lock-in
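As a rough illustration of the "extend for custom behavior" idea, overriding a single step might look like the sketch below. The `Agent` base class and its method names here are illustrative stand-ins, not ClientAI's actual API:

```python
# Hypothetical base class standing in for an agent with an overridable
# prompt-building step; ClientAI's real class and hooks may differ.
class Agent:
    def run(self, prompt: str) -> str:
        return self.build_prompt(prompt)

    def build_prompt(self, prompt: str) -> str:
        return prompt


class UppercaseAgent(Agent):
    # Override just the prompt-building step; the rest of the
    # workflow is inherited unchanged.
    def build_prompt(self, prompt: str) -> str:
        return prompt.upper()


print(UppercaseAgent().run("analyze this"))  # ANALYZE THIS
```

The point is the shape of the extension: subclass, override one step, and keep everything else, rather than reimplementing the whole workflow.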

Requirements

  • Python: Version 3.9 or newer
  • Dependencies: Core package has minimal dependencies. Provider-specific packages are optional.

Contributing

Contributions are welcome! Please see our Contributing Guidelines for more information.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Igor Magalhaes – @igormagalhaesr – igormagalhaesr@gmail.com – github.com/igorbenav