Description
Is it a request payload issue?
[x] Yes, this is a request payload issue. I am using a client/cURL to send a request payload, but I received an unexpected error.
[ ] No, it's another issue.
If it's a request payload issue, you MUST know
Our team doesn't have any GODs or ORACLEs or MIND READERs. Please make sure to attach the request log or curl payload.
Describe the bug
```
GitHub Copilot gpt-5.4: bad response status code 400, message: model "gpt-5.4" is not accessible via the /chat/completions endpoint, body: {"error":{"message":"model \"gpt-5.4\" is not accessible via the /chat/completions endpoint","code":"unsupported_api_for_model"}}
```
CLI Type
OpenAI Compatible
Model Name
gpt-5.4
LLM Client
OpenCode or simple HTTP requests
Expected behavior
As with gpt-5.x-codex, requests for this model should be automatically translated to the Responses API instead of failing on /chat/completions.
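A minimal sketch of the expected routing, assuming the proxy picks the endpoint from the model name (the function name and the matching rule are illustrative assumptions, not the project's actual code):

```python
def resolve_endpoint(model: str) -> str:
    """Pick the upstream endpoint for a model.

    Assumption for illustration: gpt-5.x-codex models (and gpt-5.4, per
    the error above) only accept the Responses API, so they should be
    routed to /responses rather than /chat/completions.
    """
    responses_only = model.startswith("gpt-5.") and (
        "codex" in model or model == "gpt-5.4"
    )
    return "/responses" if responses_only else "/chat/completions"
```

With this kind of check in place, a request for `gpt-5.4` would be sent to `/responses`, while models that still support Chat Completions would be unaffected.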
OS Type
- OS: Linux
- Version v6.8.51-0