iaalm/llama-api-server

Is there a standardised way to interact with LLMs?

mosajjal opened this issue · 6 comments

Hi.

Has anyone come up with a unified, standard API to talk to all LLMs (LLaMA, Bard, OpenAI, etc.)? If not, it would be helpful to start defining one as a community and build the compatibility layers for it. Keen to hear your thoughts.

iaalm commented

The OpenAI API is the de facto standard in this area. That's why I started this project: to call custom LLMs the OpenAI way. Any suggestions?
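
For anyone landing here, "calling a custom LLM the OpenAI way" just means sending an OpenAI-shaped request to a locally hosted server. A minimal sketch, assuming an OpenAI-compatible `/v1/completions` endpoint is running on localhost; the port, path, and model name below are placeholders, not the project's actual defaults:

```python
import requests

# Hypothetical local endpoint; the real port, path and model name depend
# on how the server is configured.
BASE_URL = "http://127.0.0.1:8000/v1"

resp = requests.post(
    f"{BASE_URL}/completions",
    json={
        "model": "llama-7b",                    # placeholder model name
        "prompt": "Say hello in one sentence.",
        "max_tokens": 32,
    },
    timeout=60,
)
resp.raise_for_status()
# OpenAI-style completions responses put the generated text under choices[0].text
print(resp.json()["choices"][0]["text"])
```

Because the request and response shapes match OpenAI's, existing client libraries can usually be pointed at a local server simply by overriding the base URL.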

mosajjal commented

I agree that OpenAI is currently considered the standard. However, I don't think OpenAI is committed to keeping the API specification intact for the foreseeable future.

My suggestion is to start a community-led group to codify a standard and to build compatibility shims for the major models (OpenAI, LLaMA, etc.) against that standard. We can always start with OpenAI's reference and build on top of it.

iaalm commented

Agreed, that would be very useful 😎

mosajjal commented

Cool :) So how should we start this? We need to form a group of some sort and start exchanging ideas on what it would look like.

iaalm commented

Seriously, I don't know. But maybe we can define an API spec and implement adaptors for popular models?
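
To make the adaptor idea a bit more concrete, here's a rough sketch of what a shared interface with per-model adaptors might look like. This is just an illustration, not a proposed spec; the class and method names are made up:

```python
from abc import ABC, abstractmethod


class ChatAdapter(ABC):
    """Illustrative shared interface that each backend would implement."""

    @abstractmethod
    def chat(self, messages: list[dict], **params) -> str:
        """Accept OpenAI-style messages and return the assistant's reply text."""


class OpenAIAdapter(ChatAdapter):
    def chat(self, messages, **params):
        # Would forward the request to the OpenAI API.
        raise NotImplementedError


class LlamaAdapter(ChatAdapter):
    def chat(self, messages, **params):
        # Would forward the request to a locally hosted LLaMA server.
        raise NotImplementedError
```

A server could then expose the agreed spec over HTTP and route each request to whichever adaptor matches the configured backend.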

mosajjal commented

Sounds good. I can help with Bard.