
C# library to interact with DuckDuckGo AI


DdgAiProxy

DdgAiProxy is a library for working with DuckDuckGo AI and making requests to it. The repo also contains a web server for working with the library.

As Server

Not everyone is familiar with C# (or other .NET languages), so, if you want, you can use it as a web server with an HTTP API.

Warning

The web API is still at an early stage of development, and I do not recommend using it in production, especially if you plan to use a lot of dialogs.

Docker


The fastest way to run it is in a Docker container with

docker run -p 14532:5000 -d perdub/ddg-ai-proxy:latest

and after that you can make requests to the API.

Binary Files

If you can't or don't want to use Docker, you can run it as a standalone application. Compile it yourself or use the prebuilt binaries from the releases page.

If you have .NET installed on your computer, you can download the *-isSelfContained-false builds; otherwise, you should use the isSelfContained-true builds, which bundle everything needed.
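As a quick sketch, you can check for the .NET CLI to decide which build to download (the build names follow the release naming above; the check itself is just one way to detect an installed runtime):

```shell
# Default to the self-contained build; switch to the smaller one
# if the `dotnet` CLI is already on PATH.
BUILD="isSelfContained-true"
if command -v dotnet >/dev/null 2>&1; then
  BUILD="isSelfContained-false"
fi
echo "download a *-$BUILD build"
```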

API Usage

You can use two different APIs: a custom one and an OpenAI-compatible one. The OpenAI-compatible API is intended to be used as a custom endpoint, so you can point any program at this API by replacing its endpoint. It does not work fully yet: at the moment only /chat/completions is implemented, and only with the required fields in the request and response (per the OpenAI OpenAPI schema).
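As a sketch, a minimal request body with only the required fields might look like this (field names follow the OpenAI chat schema mentioned above; the base URL in the comment assumes the docker example from earlier):

```shell
# Minimal /chat/completions request body with only the required fields
# (a sketch per the OpenAI chat schema, not the proxy's full surface).
BODY='{
  "model": "gpt-4o-mini",
  "messages": [{"role": "user", "content": "Hello!"}]
}'
printf '%s\n' "$BODY"

# In real use, POST it to the proxy (port from the docker example above):
#   curl -s -X POST http://localhost:14532/chat/completions \
#     -H 'Content-Type: application/json' -d "$BODY"
```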

Custom API

To init a dialog, call /base/api/init with model=0 as a query param, where the value is the number representation of the model you want:

| Model | Number representation | Model Name |
| --- | --- | --- |
| GPT-4o mini | 0 | gpt-4o-mini |
| Claude 3 Haiku | 1 | claude-3-haiku-20240307 |
| Llama 3 70B | 2 | meta-llama/Llama-3-70b-chat-hf |
| Llama 3.1 70B | 5 | meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo |
| Mixtral 8x7B | 3 | mistralai/Mixtral-8x7B-Instruct-v0.1 |

Note

Llama 3 70B was updated by DuckDuckGo AI to Llama 3.1 70B, but the tests still pass with the old model and the API works, so use it at your own risk if you want.

The response will contain a ddg-ai-proxy-guid header, which represents your dialog id. After that, call /base/api/talk with two query params: guid (the header value from the previous request) and message (your prompt to the LLM).
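Put together, the flow might look like this shell sketch (the endpoints and header name are from the docs above; the port assumes the docker example, and the sample headers are illustrative):

```shell
BASE="http://localhost:14532"

# 1. Init a dialog (model=0 selects GPT-4o mini) and capture the headers:
#      curl -sD - -o /dev/null "$BASE/base/api/init?model=0"
# For illustration, assume we captured headers like these:
HEADERS='HTTP/1.1 200 OK
ddg-ai-proxy-guid: 123e4567-e89b-12d3-a456-426614174000'

# 2. Pull the dialog id out of the ddg-ai-proxy-guid header.
GUID=$(printf '%s\n' "$HEADERS" | tr -d '\r' \
  | awk -F': ' 'tolower($1)=="ddg-ai-proxy-guid" {print $2}')
echo "guid: $GUID"

# 3. Send a prompt (URL-encode the message in real use):
#      curl -s "$BASE/base/api/talk?guid=$GUID&message=Hello"
```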

As Library


Adding

You can add the NuGet package to your project with

dotnet add package DdgAi

or go to the NuGet package page.

Usage

A basic conversation example:

```csharp
// Create a client and a dialog manager, init a dialog, then send a prompt.
DialogManager dialogManager = new DialogManager(new CustomClient());
await dialogManager.Init(Model.Gpt3_5_turbo);
string response = await dialogManager.SendMessage("Hello! How are you? And who are you?");
```

You can also find this example in the examples folder.

So, what happens here? First, we create DialogManager and CustomClient instances. CustomClient derives from the HttpClient class and helps us make requests to the DuckDuckGo AI servers. With basic usage like this you don't need to care about it, but if your app uses two or more DialogManagers, CustomClient can and should be a singleton.

After that, with the CustomClient instance, we create our DialogManager to send requests to the LLM. After creating it, we should init the dialog with the Init() method. As a param, it takes a Model enum value specifying the model we want to use.

Now we are ready to send and receive: call SendMessage(string text) with your prompt to the LLM, and the function will return the LLM's response (if nothing is broken at that moment).

TODO:

  • refactor library code
  • refactor web server code
  • add more tools to control the web server (like clearing old dialogs; yeah, it's not implemented now)
  • add proxy support
  • .NET Standard support
  • support things like AOT and trimming
  • ???