o1-mini speed test fails
Closed this issue · 10 comments
woodchen-ink commented
Provider API error: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead. (request id: 2024091309440350917984659626563)
This is the error it returns.
MartialBE commented
o1 has dropped max_tokens.
woodchen-ink commented
error": { "code": "unsupported_value", "type": "invalid_request_error", "param": "temperature", "message": "Unsupported value: 'temperature' does not support 0.6 with this model. Only the default (1) value is supported." },
This doesn't work either.
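Based on the two error messages above, a compliant o1-mini request has to send max_completion_tokens instead of max_tokens and leave temperature at its default (omit it or send 1). A minimal sketch of such a request body in plain Go (the endpoint, struct names, and token handling here are assumptions for illustration, not the project's actual code):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// Message mirrors the chat message shape of the OpenAI chat/completions API.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// O1Request uses max_completion_tokens instead of max_tokens and omits
// temperature entirely so the server falls back to its only allowed value (1).
type O1Request struct {
	Model               string    `json:"model"`
	Messages            []Message `json:"messages"`
	MaxCompletionTokens int       `json:"max_completion_tokens,omitempty"`
	// No Temperature field: o1 models reject anything other than the default.
}

func main() {
	body, _ := json.Marshal(O1Request{
		Model:               "o1-mini",
		Messages:            []Message{{Role: "user", Content: "ping"}},
		MaxCompletionTokens: 256,
	})

	req, _ := http.NewRequest("POST", "https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```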
woodchen-ink commented
Looks like they've changed quite a lot.
woodchen-ink commented
Could we add a compatibility layer for this? If a user sends the old parameters, we automatically convert them to the new ones. A lot of client software hasn't been updated yet.
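Something along these lines might work as the shim: a minimal sketch that rewrites an incoming request body before it is forwarded upstream. The function name, the isO1Model helper, and the package layout are assumptions for illustration, not the project's existing code:

```go
package relay

import (
	"encoding/json"
	"strings"
)

// AdaptO1Params rewrites a client request body that still uses the old
// parameters so it satisfies the o1 restrictions:
//   - max_tokens  -> max_completion_tokens
//   - temperature -> dropped (only the default value 1 is accepted)
// Requests for non-o1 models are passed through untouched.
func AdaptO1Params(body []byte, model string) ([]byte, error) {
	if !isO1Model(model) {
		return body, nil
	}

	var req map[string]any
	if err := json.Unmarshal(body, &req); err != nil {
		return nil, err
	}

	// Old clients send max_tokens; o1 only accepts max_completion_tokens.
	if v, ok := req["max_tokens"]; ok {
		if _, exists := req["max_completion_tokens"]; !exists {
			req["max_completion_tokens"] = v
		}
		delete(req, "max_tokens")
	}

	// o1 rejects any explicit temperature other than the default (1),
	// so the simplest conversion is to drop it entirely.
	delete(req, "temperature")

	return json.Marshal(req)
}

// isO1Model is a placeholder check; the real list of affected models
// would need to be maintained somewhere central.
func isO1Model(model string) bool {
	return strings.HasPrefix(model, "o1")
}
```

If other sampling parameters turn out to be rejected the same way, they could be stripped or remapped in the same place.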
MartialBE commented
All these restrictions, and the rate limits are low too.
woodchen-ink commented
The rate limits will open up over time; it's only just launched.
I have a feeling this self-reflection style of reasoning will become the mainstream way to improve highly capable AI.
MartialBE commented
I'll fix it later, I'm catching a high-speed train right now 🤣🤣🤣
woodchen-ink commented
I'll fix it later, I'm catching a high-speed train right now 🤣🤣🤣
Going home for the holiday?
MartialBE commented
Yes.