When I tried the optillm with my own openai API compatible hosted model I get this error
I used "no_key" as the API key.
error': "Error code: 401 - {'error': {'message': 'Incorrect API key provided: no_key. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"
Did you run with -base_url http://your_api_end_point/v1? The error suggests the request is still going to the default OpenAI endpoint.
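For context, here is a minimal sketch of how an OpenAI-compatible endpoint is reached with the OpenAI Python client; the URL, key, and model name below are placeholders, not values from this issue. If base_url is left at its default, requests go to api.openai.com, which would produce a 401 like the one above.

```python
from openai import OpenAI

# Placeholder values: adjust the URL, key, and model for your own deployment.
client = OpenAI(
    api_key="sk-dummy",                       # dummy key in the OpenAI key format
    base_url="http://your_api_end_point/v1",  # without this, requests go to api.openai.com
)

response = client.chat.completions.create(
    model="your-model-name",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```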
Yup. Actually, we were able to sort it out by passing an api_key that looks like an OpenAI API key.
Btw this is some amazing work!
Glad it worked out!
The problem seems to be some sort of key-format validation. We just created a dummy API key with exactly the same pattern as a real OpenAI API key.
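A hedged sketch of that workaround: building a dummy key that follows the usual OpenAI key shape (the "sk-" prefix plus a random suffix). The suffix length here is an arbitrary choice, not something the endpoint checks.

```python
import secrets

# Dummy key for a self-hosted, OpenAI-compatible endpoint that ignores the
# key's value; the "sk-" prefix is what satisfies the format check.
dummy_api_key = "sk-" + secrets.token_hex(24)
```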
Oh, I think it just has to be prefixed with "sk-", because I check for that here to ensure it is a valid OpenAI key.
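For illustration only (this is not the actual optillm code), the kind of prefix check being described would reject a value like "no_key" but accept any "sk-"-prefixed string:

```python
def looks_like_openai_key(api_key: str) -> bool:
    # Treat anything starting with "sk-" as a plausibly valid OpenAI-style key.
    return api_key.startswith("sk-")

looks_like_openai_key("no_key")       # False -> rejected
looks_like_openai_key("sk-dummy123")  # True  -> accepted
```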
True. It might be good to update the documentation for future reference.