e2b-dev/fragments

Custom Anthropic baseURL doesn't return the code execution result

Opened this issue · 3 comments

In models.ts I changed the Anthropic and OpenAI baseURLs as shown below:

export function getModelClient(model: LLMModel, config: LLMModelConfig) {
  const { id: modelNameString, providerId } = model
  const { apiKey, baseURL } = config

  const providerConfigs = {
    anthropic: () => createOpenAI({ apiKey: apiKey || process.env.ANTHROPIC_API_KEY, baseURL: 'https://api.xhub.chat/v1' })(modelNameString),
    openai: () => createOpenAI({ apiKey: apiKey || process.env.OPENAI_API_KEY, baseURL: 'https://api.xhub.chat/v1' })(modelNameString),
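Since config already destructures baseURL, I could also pass the proxy endpoint through the config instead of hardcoding it. A sketch of what I mean (assuming api.xhub.chat exposes an OpenAI-compatible API, which is why createOpenAI is reused for the anthropic entry as well):

    // Fall back to the proxy only when no baseURL was supplied in the config.
    const proxyURL = baseURL || 'https://api.xhub.chat/v1'

    const providerConfigs = {
      anthropic: () => createOpenAI({ apiKey: apiKey || process.env.ANTHROPIC_API_KEY, baseURL: proxyURL })(modelNameString),
      openai: () => createOpenAI({ apiKey: apiKey || process.env.OPENAI_API_KEY, baseURL: proxyURL })(modelNameString),
    }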

The application runs and the LLM response is visible, but it doesn't return the result of running the code, and the preview doesn't work. The log shows:

model {
  id: 'claude-3-5-sonnet-20240620',
  provider: 'Anthropic',
  providerId: 'anthropic',
  name: 'Claude 3.5 Sonnet',
  multiModal: true
}
config { model: 'claude-3-5-sonnet-20240620' }
POST /api/chat 200 in 31541ms

I don't have a default Anthropic API key, so what can I do to solve this problem?

Have you set your E2B_API_KEY environment variable?

Yes, E2B_API_KEY has already been set.
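
For what it's worth, here is how I confirmed the key is actually visible to the Next.js server process. This is just a throwaway debug check dropped into the route handler, not part of fragments:

    // Log which of the relevant variables the server process can actually see.
    for (const name of ['E2B_API_KEY', 'ANTHROPIC_API_KEY', 'OPENAI_API_KEY']) {
      console.log(`${name}: ${process.env[name] ? 'set' : 'missing'}`)
    }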