OpenAI.el

Elisp library for the OpenAI API

The OpenAI Elisp library provides convenient access to the OpenAI API from applications written in the Elisp language.

P.S. This package is expected to be used as a library, so there are only a few interactive commands you can use, and those are mostly examples.

📚 Documentation

🔨 Usage

You will need to set up your API key before you can use this library.

(setq openai-key "[YOUR API KEY]")

Alternatively, you can configure a function to retrieve the key from some external source. A function, openai-key-auth-source, is provided to retrieve the key from an auth-source entry whose :host is api.openai.com.

(setq openai-key #'openai-key-auth-source)
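
For example, if you keep credentials in ~/.authinfo (or ~/.authinfo.gpg), the entry might look something like the line below. The exact fields (such as the login value) depend on your auth-source setup; the line is a sketch, not the only accepted form.

machine api.openai.com login openai password [YOUR API KEY]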

For requests that need your user identifier,

(setq openai-user "[YOUR USER UID]")

For using another OpenAI endpoint,

(setq openai-base-url "[OPENAI BASE URL]")

💡 Tip

The two variables openai-key and openai-user are the default values for sending requests! However, you can still override them per request by passing the keywords :key and :user.
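
For instance, a single request could carry its own key and user (the values shown are placeholders):

(openai-completion "How are you?"
                   (lambda (data)
                     (message "%s" data))
                   :key "[ANOTHER API KEY]"
                   :user "[ANOTHER USER UID]")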

🔰 The simplest example

Here is the simplest example showing how to use this library: a request function called with a query and a callback function.

(openai-completion "How are you?"
                   (lambda (data)
                     (message "%s" data)))
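
The callback receives the server's response. As a hedged sketch, assuming the response is decoded into an alist in which choices is a vector of alists each containing a text field, you could pull out the completion text like this:

(openai-completion "How are you?"
                   (lambda (data)
                     ;; Hypothetical extraction: assumes DATA is the decoded
                     ;; JSON response as an alist, where `choices' is a vector
                     ;; of alists each containing a `text' field.
                     (let* ((choices (cdr (assq 'choices data)))
                            (first   (elt choices 0))
                            (text    (cdr (assq 'text first))))
                       (message "%s" text))))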

📨 Sending Request

All arguments are exposed in the argument list, so you can send any request in any way you want.

For example, the request function openai-completion accepts the argument max-tokens. From OpenAI's API reference page:

max_tokens integer Optional Defaults to 16

The maximum number of tokens to generate in the completion.

The token count of your prompt plus max_tokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).

(openai-completion ...
                   ...
                   :max-tokens 4096)  ; max out tokens!

📢 API functions

The API functions follow this pattern:

[PACKAGE NAME]-[API TYPE]-[REQUEST NAME]

For example:

(openai-file-list ...)
  • openai - is the package name
  • file - is the API type, see the OpenAI API reference
  • list - is the request name

🔍 Parameters

The function's parameters follow this order:

  1. required - variables that are required for this type of request
  2. callback - the function executed after the request is made
  3. optional - other variables that are not required but affect the final output

(openai-completion "How are you?"          ; required
                   (lambda (data)          ; callback
                     ...)
                   :max-tokens 4096)       ; optional

🖥 Setting Model

Every type of request has a default model, so you don't have to worry about which model to use! However, if you want to use another model, pass the keyword :model to override the default.

(openai-completion ...
                   ... 
                   :model "text-davinci-003")  ; replace the default model

🛑 Debugging

While experimenting with this library, you might see this error quite often:

400 - Bad request.  Please check error message and your parameters

Try setting the variable openai--show-log to t; it will show more detailed error messages.
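
For example:

(setq openai--show-log t)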

📂 Example projects

🔗 References

Contribute

If you would like to contribute to this project, you may either clone this repository and make pull requests, or clone the project and maintain your own branch of this tool. Either way is welcome!