There is a bug that reproduces every time. If I set the model to o1-mini,
set max tokens to 32768,
and set the compression model to gpt-4o-mini,
then after two chat turns, title generation fails with the error: { "error": { "message": "max_tokens is too large: 32768. This model supports at most 16384 completion tokens, whereas you provided 32768. (request id: )", "type": "invalid_request_error", "param": "max_tokens", "code": null }}
Originally posted by @jiangying000 in #5646 (comment)
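For reference, the error suggests that the chat-level max_tokens (32768) is being forwarded unchanged to the compression/summary model (gpt-4o-mini), which accepts at most 16384 completion tokens. Below is a minimal TypeScript sketch of one possible fix, clamping max_tokens to a per-model cap before the title request is sent; the cap table, CompletionRequest shape, and clampMaxTokens helper are hypothetical illustrations, not NextChat's actual code.

```typescript
// Hypothetical per-model completion-token caps. The 16384 value for
// gpt-4o-mini comes from the error message above; the o1-mini value is
// only illustrative.
const COMPLETION_TOKEN_CAPS: Record<string, number> = {
  "gpt-4o-mini": 16384,
  "o1-mini": 65536,
};

interface CompletionRequest {
  model: string;
  max_tokens: number;
  messages: { role: string; content: string }[];
}

// Clamp the user-configured max_tokens to the target model's cap, so a
// value chosen for the chat model (e.g. 32768 for o1-mini) is not passed
// as-is to the summary model.
function clampMaxTokens(req: CompletionRequest): CompletionRequest {
  const cap = COMPLETION_TOKEN_CAPS[req.model];
  return cap === undefined
    ? req
    : { ...req, max_tokens: Math.min(req.max_tokens, cap) };
}

// Example: the title-generation request inherits max_tokens = 32768 but
// targets gpt-4o-mini, so it is clamped to 16384.
const titleRequest = clampMaxTokens({
  model: "gpt-4o-mini",
  max_tokens: 32768,
  messages: [
    { role: "user", content: "Summarize this conversation as a short title." },
  ],
});
// titleRequest.max_tokens === 16384
```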