didalgolab/chatgpt-intellij-plugin

Claude.ai

cevatkerim opened this issue · 5 comments

Would be great to include Claude.ai as an option.

Hi @cevatkerim, what is your use case?

  • do you have direct access to the Claude API, or would you consider using Claude through third-party providers such as OpenRouter.ai?
  • do you code privately or professionally? If professionally, do you trust third-party providers such as OpenRouter.ai enough to share your code with them?
  • do you believe Claude is better at coding than OpenAI's models?

Hi @didalgolab, I believe I can help with this issue. I'm the maintainer of LiteLLM (https://github.com/BerriAI/litellm) - it lets you use any LLM as a drop-in replacement for gpt-3.5-turbo.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider API directly.

from litellm import completion
import os
## set ENV variables 
os.environ["OPENAI_API_KEY"] = "your-key" # 
os.environ["COHERE_API_KEY"] = "your-key" # 

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Using the LiteLLM Proxy with a LiteLLM Key

This is great if you don't have direct access to Claude but want to use the open-source LiteLLM proxy to access it.

from litellm import completion
import os

## set ENV variables 
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

I meant having it as an option alongside the ChatGPT interface. Claude.ai is quite capable and produces good output.

It's been a long time, but support for the Claude API (not through an embedded browser) is planned for release 1.0.0, currently on the release/1.0.x branch.
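
For reference, "Claude API" here means calling Anthropic's HTTP API with an API key rather than driving claude.ai through an embedded browser. A rough sketch of such a request against Anthropic's Messages API, in Python purely for illustration - the model name is a placeholder, and the actual release may use a different endpoint or client:

import requests

ANTHROPIC_API_KEY = "your-key"

response = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": ANTHROPIC_API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-opus-20240229",  # placeholder model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello, how are you?"}],
    },
)

# the assistant's reply text is in the first content block of the response
print(response.json()["content"][0]["text"])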

Available in v1.0.0