Ollama4j

A Java library (wrapper/binding) for Ollama server.

Find more details on the website.

How does it work?

  flowchart LR
    o4j[Ollama4j]
    o[Ollama Server]
    o4j -->|Communicates with| o;
    m[Models]
    subgraph Ollama Deployment
        direction TB
        o -->|Manages| m
    end
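
As a quick illustration of the flow above, here is a minimal sketch of connecting to a local Ollama server through the library. The class and method names (OllamaAPI, ping) follow the project's documentation for the 1.0.x line; see the website for the authoritative, current API.

import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class Quickstart {
    public static void main(String[] args) throws Exception {
        // Point the client at a running Ollama server (default port 11434)
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

        // Verify the server is reachable before issuing model requests
        System.out.println("Server reachable: " + ollamaAPI.ping());
    }
}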

Requirements

  • Java 11 or newer
  • Ollama server, either set up natively or running in a Docker container

Installation

Note

We have migrated the package repository from Maven Central to the GitHub package repository due to technical issues with publishing. Please update your repository settings to get the latest version of Ollama4j.

Track the releases here and update the dependency version according to your requirements.

For Maven

Using JitPack

  1. Add the jitpack.io repository to your project's pom.xml or your settings.xml:
<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>
  2. In your Maven project, add this dependency:
<dependency>
    <groupId>io.github.amithkoujalgi</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.74</version>
</dependency>
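
To confirm that the artifact resolves from JitPack before writing any code, you can run Maven's standard resolution goal:

mvn dependency:resolve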

Using GitHub's Maven Package Repository

  1. Add the GitHub Maven Packages repository to your project's pom.xml or your settings.xml:
<repositories>
    <repository>
        <id>github</id>
        <name>GitHub Apache Maven Packages</name>
        <url>https://maven.pkg.github.com/amithkoujalgi/ollama4j</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
  2. Add the github server to settings.xml (usually available at ~/.m2/settings.xml). YOUR-TOKEN must be a GitHub personal access token with the read:packages scope:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                      http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <servers>
        <server>
            <id>github</id>
            <username>YOUR-USERNAME</username>
            <password>YOUR-TOKEN</password>
        </server>
    </servers>
</settings>
  3. In your Maven project, add this dependency:
<dependency>
    <groupId>io.github.amithkoujalgi</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.74</version>
</dependency>

For Gradle

  1. Add the JitPack repository to your build file

Add it in your root settings.gradle at the end of repositories:

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        mavenCentral()
        maven { url 'https://jitpack.io' }
    }
}
  2. Add the dependency, replacing Tag with a published release tag:
dependencies {
    implementation 'com.github.amithkoujalgi:ollama4j:Tag'
}
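
For example, pinning the same version used in the Maven instructions above:

dependencies {
    implementation 'com.github.amithkoujalgi:ollama4j:1.0.74'
}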

API Spec

Tip

Find the full API specifications on the website.

Development

Build:

make build

Run unit tests:

make ut

Run integration tests:

make it

Releases

Newer artifacts are published via the GitHub Actions CI workflow when a new release is created from the main branch.

Who's using Ollama4j?

Traction

Star History Chart

Areas of improvement

  • Use Java naming conventions for attributes in the request/response models instead of snake-case (possibly with Jackson's @JsonProperty)
  • Fix deprecated HTTP client code
  • Set up logging
  • Use Lombok
  • Update request body creation with Java objects
  • Async APIs for images
  • Support for function calling with models like Mistral
    • generate in sync mode
    • generate in async mode
  • Add custom headers to requests
  • Add additional params for ask APIs (see the sketch after this list), such as:
    • options: additional model parameters for the Modelfile, such as temperature (see the supported params)
    • system: system prompt (overrides what is defined in the Modelfile)
    • template: the full prompt or prompt template (overrides what is defined in the Modelfile)
    • context: the context parameter returned from a previous request, which can be used to keep a short conversational memory
    • stream: Add support for streaming responses from the model
  • Add test cases
  • Handle exceptions better (maybe throw more appropriate exceptions)
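
These ask-API parameters mirror the fields of Ollama's REST /api/generate endpoint. As background for that roadmap item, here is a purely illustrative sketch that posts those fields with Java's built-in HttpClient; it does not use any ollama4j API, and the model name and option values are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GenerateWithParams {
    public static void main(String[] args) throws Exception {
        // model, prompt, system, stream and options are fields of Ollama's
        // /api/generate request schema, not (yet) part of ollama4j itself.
        String body = "{"
                + "\"model\": \"mistral\","
                + "\"prompt\": \"Why is the sky blue?\","
                + "\"system\": \"You are a concise assistant.\","
                + "\"stream\": false,"
                + "\"options\": {\"temperature\": 0.7}"
                + "}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}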

Get Involved

Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.

Credits

The nomenclature and the icon have been adopted from the incredible Ollama project.

Thanks to the amazing contributors

Appreciate my work?

Buy Me A Coffee