13 changes: 13 additions & 0 deletions internal/registry/model_definitions.go
@@ -365,6 +365,19 @@ func GetGitHubCopilotModels() []*ModelInfo {
			SupportedEndpoints: []string{"/responses"},
			Thinking:           &ThinkingSupport{Levels: []string{"none", "low", "medium", "high", "xhigh"}},
		},
		{
			ID:                  "gpt-5.4",
			Object:              "model",
			Created:             now,
			OwnedBy:             "github-copilot",
			Type:                "github-copilot",
			DisplayName:         "GPT-5.4",
			Description:         "OpenAI GPT-5.4 via GitHub Copilot",
			ContextLength:       200000,
			MaxCompletionTokens: 32768,
			SupportedEndpoints:  []string{"/responses"},
			Thinking:            &ThinkingSupport{Levels: []string{"low", "medium", "high", "xhigh"}},

Review comment (severity: medium):

For consistency with other /responses-only models like gpt-5.3-codex, consider adding "none" to the Thinking.Levels for gpt-5.4. This would allow users to explicitly disable the thinking feature for this model, which is a common option for similar models.

Suggested change:
-			Thinking: &ThinkingSupport{Levels: []string{"low", "medium", "high", "xhigh"}},
+			Thinking: &ThinkingSupport{Levels: []string{"none", "low", "medium", "high", "xhigh"}},

		},
		{
			ID:     "claude-haiku-4.5",
			Object: "model",
23 changes: 23 additions & 0 deletions internal/runtime/executor/github_copilot_executor.go
@@ -577,9 +577,32 @@ func useGitHubCopilotResponsesEndpoint(sourceFormat sdktranslator.Format, model
		return true
	}
	baseModel := strings.ToLower(thinking.ParseSuffix(model).ModelName)
	if info := registry.GetGlobalRegistry().GetModelInfo(baseModel, ""); info != nil {
		if len(info.SupportedEndpoints) > 0 && !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath) {
			return true
		}
	}

Review comment (Copilot AI, Mar 22, 2026):

The registry lookup uses GetModelInfo(baseModel, "") which returns the last-registered/global model info for that ID, not the GitHub Copilot provider-specific info. Since ModelRegistry keys are shared across providers, this can select endpoints from an unrelated provider (depending on registration order) and mis-route Copilot traffic. Consider querying provider-specific info (e.g. provider "github-copilot") or otherwise scoping the lookup to Copilot models only.

Suggested change (delete the lookup):
-	if info := registry.GetGlobalRegistry().GetModelInfo(baseModel, ""); info != nil {
-		if len(info.SupportedEndpoints) > 0 && !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath) {
-			return true
-		}
-	}
	for _, info := range registry.GetGitHubCopilotModels() {
		if info == nil || !strings.EqualFold(info.ID, baseModel) {
			continue
		}
		if len(info.SupportedEndpoints) > 0 && !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath) {
			return true
		}
		break
	}

Review comment (Copilot AI, Mar 22, 2026):

useGitHubCopilotResponsesEndpoint calls registry.GetGitHubCopilotModels() (which constructs the full static model list) on every request and then linearly scans it. This adds avoidable allocations/CPU in a hot path; consider caching a map/set of responses-only Copilot models (or consulting the dynamic registry first and only falling back to static definitions on miss).

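The caching this comment suggests could be sketched roughly as follows. Everything here is a pared-down stand-in: `ModelInfo`, the endpoint constants, and `staticCopilotModels` only loosely model the project's registry types, and a real implementation would build the set from `registry.GetGitHubCopilotModels()`.

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// ModelInfo is a pared-down stand-in for the registry type.
type ModelInfo struct {
	ID                 string
	SupportedEndpoints []string
}

const (
	chatPath      = "/chat/completions" // stands in for githubCopilotChatPath
	responsesPath = "/responses"        // stands in for githubCopilotResponsesPath
)

// staticCopilotModels stands in for registry.GetGitHubCopilotModels().
func staticCopilotModels() []*ModelInfo {
	return []*ModelInfo{
		{ID: "gpt-5.4", SupportedEndpoints: []string{responsesPath}},
		{ID: "claude-haiku-4.5", SupportedEndpoints: []string{chatPath, responsesPath}},
	}
}

func containsEndpoint(endpoints []string, endpoint string) bool {
	for _, item := range endpoints {
		if item == endpoint {
			return true
		}
	}
	return false
}

var (
	responsesOnlyOnce sync.Once
	responsesOnlySet  map[string]struct{}
)

// responsesOnlyCopilotModels builds the set exactly once and reuses it,
// so the per-request cost is a single map lookup instead of rebuilding
// and linearly scanning the static model list.
func responsesOnlyCopilotModels() map[string]struct{} {
	responsesOnlyOnce.Do(func() {
		responsesOnlySet = make(map[string]struct{})
		for _, info := range staticCopilotModels() {
			if info == nil || len(info.SupportedEndpoints) == 0 {
				continue
			}
			if !containsEndpoint(info.SupportedEndpoints, chatPath) && containsEndpoint(info.SupportedEndpoints, responsesPath) {
				responsesOnlySet[strings.ToLower(info.ID)] = struct{}{}
			}
		}
	})
	return responsesOnlySet
}

func main() {
	_, ok := responsesOnlyCopilotModels()["gpt-5.4"]
	fmt.Println(ok) // responses-only: true
	_, ok = responsesOnlyCopilotModels()["claude-haiku-4.5"]
	fmt.Println(ok) // also supports chat: false
}
```

The hot-path check then reduces to a map membership test, and `sync.Once` keeps the lazy initialization race-free without locking on every request.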

Review comment (severity: high):

The current logic for determining which endpoint to use has a potential flaw in how it prioritizes model definitions. It first checks the dynamic registry, and if the model is found but isn't a /responses-only model, it falls through to check the static list of GitHub Copilot models. This can lead to incorrect behavior if a model has different definitions in the dynamic and static registries, as the dynamic definition should be considered authoritative.

For instance, if a model is dynamically registered with SupportedEndpoints: ["/chat/completions", "/responses"], the first check fails. If a stale static definition for the same model exists with SupportedEndpoints: ["/responses"], the function will incorrectly return true.

The logic should be revised to ensure that if a model is found in the dynamic registry, that definition is used exclusively for the decision, and the static list is only consulted as a fallback.

Suggested change:
-	if info := registry.GetGlobalRegistry().GetModelInfo(baseModel, ""); info != nil {
-		if len(info.SupportedEndpoints) > 0 && !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath) {
-			return true
-		}
-	}
-	for _, info := range registry.GetGitHubCopilotModels() {
-		if info == nil || !strings.EqualFold(info.ID, baseModel) {
-			continue
-		}
-		if len(info.SupportedEndpoints) > 0 && !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath) {
-			return true
-		}
-		break
-	}
+	if info := registry.GetGlobalRegistry().GetModelInfo(baseModel, ""); info != nil {
+		// If the model is in the dynamic registry, its definition is authoritative.
+		if len(info.SupportedEndpoints) > 0 {
+			return !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath)
+		}
+		// No endpoint info, so fall through to the codex check.
+	} else {
+		// If not in the dynamic registry, check the static definitions.
+		for _, info := range registry.GetGitHubCopilotModels() {
+			if info != nil && strings.EqualFold(info.ID, baseModel) {
+				if len(info.SupportedEndpoints) > 0 {
+					return !containsEndpoint(info.SupportedEndpoints, githubCopilotChatPath) && containsEndpoint(info.SupportedEndpoints, githubCopilotResponsesPath)
+				}
+				// Found the model but no endpoint info, so fall through to the codex check.
+				break
+			}
+		}
+	}

	return strings.Contains(baseModel, "codex")
}
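The precedence the high-severity comment argues for can be sketched in isolation. Everything here is a stand-in: `ModelInfo`, the endpoint constants, and the dynamic/static sources only loosely model the project's registry, but the decision order matches the proposed fix.

```go
package main

import (
	"fmt"
	"strings"
)

const (
	chatPath      = "/chat/completions" // stands in for githubCopilotChatPath
	responsesPath = "/responses"        // stands in for githubCopilotResponsesPath
)

// ModelInfo is a pared-down stand-in for the registry type.
type ModelInfo struct {
	ID                 string
	SupportedEndpoints []string
}

func contains(list []string, s string) bool {
	for _, item := range list {
		if item == s {
			return true
		}
	}
	return false
}

// routesToResponses applies the precedence from the review: a dynamic
// entry, when present, fully decides the outcome; the static list is
// consulted only when no dynamic entry exists; "codex" in the model
// name is the final fallback.
func routesToResponses(dynamic map[string]*ModelInfo, static []*ModelInfo, model string) bool {
	base := strings.ToLower(model)
	if info, ok := dynamic[base]; ok && info != nil {
		if len(info.SupportedEndpoints) > 0 {
			return !contains(info.SupportedEndpoints, chatPath) && contains(info.SupportedEndpoints, responsesPath)
		}
	} else {
		for _, info := range static {
			if info != nil && strings.EqualFold(info.ID, base) {
				if len(info.SupportedEndpoints) > 0 {
					return !contains(info.SupportedEndpoints, chatPath) && contains(info.SupportedEndpoints, responsesPath)
				}
				break
			}
		}
	}
	return strings.Contains(base, "codex")
}

func main() {
	static := []*ModelInfo{{ID: "gpt-5.4", SupportedEndpoints: []string{responsesPath}}}
	// A stale static entry must not override a dynamic one that also allows chat.
	dynamic := map[string]*ModelInfo{
		"gpt-5.4": {SupportedEndpoints: []string{chatPath, responsesPath}},
	}
	fmt.Println(routesToResponses(dynamic, static, "gpt-5.4")) // dynamic wins: false
	fmt.Println(routesToResponses(nil, static, "gpt-5.4"))     // static fallback: true
}
```

With this ordering, the stale-static-definition scenario from the comment (dynamic entry allowing chat, static entry responses-only) correctly routes to chat.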

func containsEndpoint(endpoints []string, endpoint string) bool {
	for _, item := range endpoints {
		if item == endpoint {
			return true
		}
	}
	return false
}

// flattenAssistantContent converts assistant message content from array format
// to a joined string. GitHub Copilot requires assistant content as a string;
// sending it as an array causes Claude models to re-answer all previous prompts.
7 changes: 7 additions & 0 deletions internal/runtime/executor/github_copilot_executor_test.go
@@ -70,6 +70,13 @@ func TestUseGitHubCopilotResponsesEndpoint_CodexModel(t *testing.T) {
	}
}

func TestUseGitHubCopilotResponsesEndpoint_RegistryResponsesOnlyModel(t *testing.T) {
	t.Parallel()
	if !useGitHubCopilotResponsesEndpoint(sdktranslator.FromString("openai"), "gpt-5.4") {
		t.Fatal("expected responses-only registry model to use /responses")
Comment on lines +74 to +77

Review comment (Copilot AI, Mar 22, 2026):

This test name suggests it exercises the dynamic ModelRegistry path, but it only relies on the static GitHub Copilot model list (no registry setup). Either rename it to reflect the static-definition behavior, or extend it to register a temporary model in the global registry with SupportedEndpoints={"/responses"} and assert the registry-based routing logic is honored.

Suggested change:
-func TestUseGitHubCopilotResponsesEndpoint_RegistryResponsesOnlyModel(t *testing.T) {
-	t.Parallel()
-	if !useGitHubCopilotResponsesEndpoint(sdktranslator.FromString("openai"), "gpt-5.4") {
-		t.Fatal("expected responses-only registry model to use /responses")
+func TestUseGitHubCopilotResponsesEndpoint_StaticResponsesOnlyModel(t *testing.T) {
+	t.Parallel()
+	if !useGitHubCopilotResponsesEndpoint(sdktranslator.FromString("openai"), "gpt-5.4") {
+		t.Fatal("expected responses-only model to use /responses")

	}
}

func TestUseGitHubCopilotResponsesEndpoint_DefaultChat(t *testing.T) {
	t.Parallel()
	if useGitHubCopilotResponsesEndpoint(sdktranslator.FromString("openai"), "claude-3-5-sonnet") {