Library for using the OpenAI API
This library provides Scala data types and ZIO services for using the OpenAI API. The examples directory contains a few examples of how to use the different features of the library.
The following example is a translation of OpenAI's official quickstart example:
```scala
import zio.{Console, ZIO, ZIOAppDefault}
import zio.openai._
import zio.openai.model.CreateCompletionRequest.{Model, Prompt}
import zio.openai.model.CreateCompletionRequest.Model.Models
import zio.openai.model.Temperature

object Quickstart extends ZIOAppDefault {

  def generatePrompt(animal: String): Prompt =
    Prompt.String {
      s"""Suggest three names for an animal that is a superhero.
         |
         |Animal: Cat
         |Names: Captain Sharpclaw, Agent Fluffball, The Incredible Feline
         |Animal: Dog
         |Names: Ruff the Protector, Wonder Canine, Sir Barks-a-Lot
         |Animal: ${animal.capitalize}
         |Names:""".stripMargin
    }

  def loop =
    for {
      animal <- Console.readLine("Animal: ")
      result <- Completions.createCompletion(
                  model = Model.Predefined(Models.`Gpt-3.5-turbo-instruct`),
                  prompt = generatePrompt(animal),
                  temperature = Temperature(0.6)
                )
      _      <- Console.printLine("Names: " + result.choices.map(_.text).mkString(", "))
    } yield ()

  override def run =
    loop.forever.provide(Completions.default)
}
```
The `Completions.default` layer initializes the OpenAI client with the default zio-http client configuration and uses ZIO's built-in configuration system to get the OpenAI API key. The default configuration provider looks for the API key in the `OPENAI_APIKEY` environment variable or the `openAI.apiKey` system property.
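If you prefer to supply the key programmatically instead of through the environment, you can install a custom `ConfigProvider`. The sketch below assumes the `openAI.apiKey` configuration path mentioned above; `Runtime.setConfigProvider` and `ConfigProvider.fromMap` are standard ZIO 2 APIs:

```scala
import zio.{ConfigProvider, Runtime, ZLayer}
import zio.openai._

// Sketch only: override ZIO's default configuration provider so that the
// `openAI.apiKey` path resolves to a value chosen at runtime instead of
// being read from the environment.
val customConfig: ZLayer[Any, Nothing, Unit] =
  Runtime.setConfigProvider(
    ConfigProvider.fromMap(Map("openAI.apiKey" -> "<your-api-key>"))
  )
```

The layer can then be provided alongside `Completions.default`, for example `loop.forever.provide(customConfig, Completions.default)`.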
If your project is using zio-http for other purposes as well and you already have a `Client` layer set up, you can use the `live` variants of the layers (for example, `Completions.live`) to share the same client.
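As a sketch of what that wiring might look like (assuming `Completions.live` takes the zio-http `Client` from the environment, and reusing the `loop` from the quickstart above):

```scala
import zio.ZIOAppDefault
import zio.http.Client
import zio.openai._

// Sketch only: one zio-http Client layer shared between your own
// HTTP calls and the zio-openai services.
object SharedClientApp extends ZIOAppDefault {
  override def run =
    loop.forever.provide(
      Client.default,   // the Client layer your application already has
      Completions.live  // assumed to reuse the Client from the environment
    )
}
```

Depending on the zio-openai version, the `live` layers may also require additional configuration layers; check the layer's type signature in your IDE to see its exact requirements.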
Start by adding `zio-openai` as a dependency to your project:

```scala
libraryDependencies += "dev.zio" %% "zio-openai" % "<version>"
```
Learn more on the ZIO OpenAI homepage!
For general guidelines, see the ZIO contributor's guide.
See the Code of Conduct.