Features:
- Custom Identity Prompt: Both versions respond according to a configurable identity prompt, giving the agent a consistent persona (a construction sketch follows this list).
- Teachability: Integrates a learning component allowing the agent to improve based on past interactions, present in both versions.
- User Proxy Agent: Simulates user interactions, useful for testing and development, used in both versions.
- Self-Reflection and API Interaction: Exclusive to the advanced version, enabling the agent to reflect on its thoughts and interact directly with APIs.
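For orientation, constructing the advanced agent with an identity prompt might look like the following; the module path, model name, and prompt text are illustrative assumptions, not the repository's exact code.
from custom_conversable_agent import CustomConversableAgent  # assumed module name

agent = CustomConversableAgent(
    name="custom_agent",
    llm_config={"model": "gpt-4", "api_key": "YOUR_API_KEY"},  # placeholder config
    identity_prompts="You are Ada, a concise and friendly assistant who always introduces herself.",
)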
Ensure Python 3.8+ is installed. The dependencies listed in requirements.txt are:
autogen==1.0.16
virtualenv
requests # Required for the advanced version
- Clone this repository:
git clone <repository-url>
- Navigate to the cloned directory:
cd <repository-directory>
- Install the required Python packages:
pip install -r requirements.txt
Run the agent by executing the Python script:
python custom_conversable_agent.py
Tokenizer parallelism is disabled to avoid multi-threading issues with the Hugging Face tokenizers library:
import os
os.environ["TOKENIZERS_PARALLELISM"] = "false"
The advanced version adds direct API interaction and self-reflection methods to the agent class:
from autogen import ConversableAgent
import requests

class CustomConversableAgent(ConversableAgent):
    # Initialization includes storing API keys and base URL
    def __init__(self, name, llm_config, identity_prompts, *args, **kwargs):
        ...

    # Enhanced request handling to include self-reflection and API requests
    def handle_request(self, message):
        ...

    # Self-reflection method
    def reflect(self, message):
        ...

    # API interaction method
    def llm_predict(self, prompt):
        ...
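The method bodies above are intentionally elided. The sketch below shows one plausible shape for them, assuming an OpenAI-style chat-completions endpoint with the API key and base URL supplied through llm_config; the endpoint path, header, and field names here are assumptions, not the repository's actual code.
import requests
from autogen import ConversableAgent

class CustomConversableAgent(ConversableAgent):
    def __init__(self, name, llm_config, identity_prompts, *args, **kwargs):
        # Use the identity prompt as the system message so replies stay in persona.
        super().__init__(name, *args, llm_config=llm_config,
                         system_message=identity_prompts, **kwargs)
        # Assumed: credentials and endpoint are carried in llm_config.
        self.api_key = llm_config.get("api_key")
        self.base_url = llm_config.get("base_url", "https://api.openai.com/v1")

    def handle_request(self, message):
        # Reflect on the request first, then answer with the reflection as added context.
        reflection = self.reflect(message)
        return self.llm_predict(f"Reflection: {reflection}\n\nUser message: {message}")

    def reflect(self, message):
        # Ask the model how best to answer before actually answering.
        return self.llm_predict(f"Briefly reflect on how to best answer: {message}")

    def llm_predict(self, prompt):
        # Direct HTTP call to a chat-completions endpoint via requests.
        response = requests.post(
            f"{self.base_url}/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": "gpt-4",
                  "messages": [{"role": "user", "content": prompt}]},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]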
Teachability, which lets the agent learn and adapt from past user interactions, is configured the same way in both versions:
from autogen.agentchat.contrib.capabilities.teachability import Teachability
teachability = Teachability(
    reset_db=False,
    path_to_db_dir="./tmp/interactive/teachability_db"
)
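The capability takes effect once it is added to an agent. Assuming the agent instance is named agent, that looks like:
# Attach the capability so the agent can store and recall what it learns.
teachability.add_to_agent(agent)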
The initiate_chat() function starts the conversation with a welcoming message:
def initiate_chat():
    ...
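The body of initiate_chat() is elided above. A minimal sketch of what it could look like, assuming the CustomConversableAgent instance is available as agent and using a UserProxyAgent to stand in for the human (the variable names and greeting text are illustrative):
from autogen import UserProxyAgent

def initiate_chat():
    # Stand-in for the human participant; prompts for input on every turn.
    user_proxy = UserProxyAgent(
        name="user",
        human_input_mode="ALWAYS",
        code_execution_config=False,
    )
    # The agent opens the conversation with a welcoming message,
    # after which the user proxy collects the human's replies.
    agent.initiate_chat(
        user_proxy,
        message="Hello! I'm your custom assistant. How can I help you today?",
    )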