ActionAI

A small library to run local functions using OpenAI function calling

Warning This library is still in its early stages, so use it cautiously. If you find any bugs, please open a new issue.

Install

pip install actionai

Usage

Note Every registered function must be fully typed and have a docstring.

# define a new function
def get_current_weather(location: str, unit: str = "fahrenheit"):
    """Function to get current weather for the given location"""
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return weather_info


import actionai

action = actionai.ActionAI()
action.register(get_current_weather)

response = action.prompt("What is the current weather in the north pole?")

print(response["choices"][0]["message"]["content"])
# The current weather at the North Pole is 72°F. It is sunny and windy.
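
To expose more than one tool, registering each function separately should work the same way. The sketch below is an assumption for illustration: get_current_time is an invented second function, and calling register once per function is presumed to be supported.

from datetime import datetime, timezone


def get_current_time():
    """Function to get the current time in UTC"""
    return {"utc_time": datetime.now(timezone.utc).isoformat()}


action = actionai.ActionAI()
action.register(get_current_weather)
action.register(get_current_time)  # assumption: one register call per function

response = action.prompt("What is the weather and current time at the north pole?")
print(response["choices"][0]["message"]["content"])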

The OpenAI API key is read automatically from the OPENAI_API_KEY environment variable. You can also pass it manually:

import actionai

action = actionai.ActionAI(openai_api_key="YOUR_KEY")
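
Alternatively, you can set the environment variable from Python before constructing the client. This is plain os.environ usage, and the key below is a placeholder, not a real key.

import os

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

import actionai

action = actionai.ActionAI()  # key is picked up from the environment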

Adding context

Sometimes a function has parameters that should be set by your program rather than by the model.

todos = {"jason": ["buy milk", "walk the dog"]}  # stand-in storage for this example

def list_todos(user: str):
    """Function to list all todos"""
    return todos[user]

action = actionai.ActionAI(context={"user": "jason"})
action.register(list_todos)

The context keys are skipped when creating the JSON schema, so the model never sees them. Their values are passed directly to the function at call time.
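
With the registration above, the model only sees and fills in the parameters left in the schema; user is injected from the context when the function runs. The prompt below is just an illustration.

response = action.prompt("What is on my todo list?")

print(response["choices"][0]["message"]["content"])
# The model never supplies `user`; user="jason" comes from the context.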

Choosing a model

By default, completions run on the gpt-3.5-turbo-0613 model. You can change this with the model argument.

import actionai

action = actionai.ActionAI(model="gpt-4")

You can find the complete list of supported chat completion models in the OpenAI documentation.

Demo

Running the todo example 👇🏻

[todo demo GIF]

For more examples, check out the examples directory.