fix(openai): only include parallel_tool_calls when explicitly enabled #11074
Summary
This PR attempts to address Issue #11071, where GLM4.5 (and potentially other models), served via both LM Studio AND OpenAI-compatible endpoints, gets stuck in a loop repeatedly reading the same file.
Root Cause
The issue was introduced in commit ed35b09 (Jan 29, 2026), which changed the `parallel_tool_calls` parameter default from `false` to `true` for all OpenAI-compatible providers. Not all models support this parameter, and when it is sent to models like GLM4.5 that do not support it, the model gets stuck in a loop.

Note: This is different from PR #11072, which only fixes LM Studio. This PR fixes both the OpenAI-compatible handler (`openai.ts`) AND the base provider class (`base-openai-compatible-provider.ts`) that is used by multiple providers.

Solution
This fix changes the OpenAI handlers to only include the `parallel_tool_calls` parameter when it is explicitly set to `true` (i.e., when `metadata?.parallelToolCalls === true`).

Changes

- `src/api/providers/openai.ts`: conditionally include `parallel_tool_calls` (4 locations)
- `src/api/providers/base-openai-compatible-provider.ts`: conditionally include `parallel_tool_calls`
- `src/api/providers/__tests__/openai.spec.ts`: verify the new behavior

Testing
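A minimal sketch of the behavior the updated specs assert (the helper below is a hypothetical stand-in; the real tests in `openai.spec.ts` exercise the actual handlers):

```typescript
// Hypothetical stand-in for the handler's request-option builder; the real
// specs call into the OpenAI handlers themselves rather than this helper.
function requestOptions(metadata?: { parallelToolCalls?: boolean }): Record<string, unknown> {
	return {
		model: "glm-4.5",
		...(metadata?.parallelToolCalls === true ? { parallel_tool_calls: true } : {}),
	}
}

// The key is omitted by default, omitted when explicitly disabled, and
// present only when explicitly enabled.
console.assert(!("parallel_tool_calls" in requestOptions()))
console.assert(!("parallel_tool_calls" in requestOptions({ parallelToolCalls: false })))
console.assert(requestOptions({ parallelToolCalls: true }).parallel_tool_calls === true)
```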
Fixes #11071
Feedback and guidance are welcome.
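The conditional-inclusion pattern described above can be sketched as follows (hypothetical names, not the repository's actual code):

```typescript
// Hypothetical sketch of the fix: the request body gains a
// `parallel_tool_calls` key only when the caller explicitly opted in.
interface RequestMetadata {
	parallelToolCalls?: boolean
}

interface ChatRequestBody {
	model: string
	parallel_tool_calls?: boolean
}

function buildRequestBody(model: string, metadata?: RequestMetadata): ChatRequestBody {
	return {
		model,
		// Spread the key only for an explicit `true`; both `false` and
		// `undefined` leave it out, so models such as GLM4.5 never see it.
		...(metadata?.parallelToolCalls === true ? { parallel_tool_calls: true } : {}),
	}
}
```

Because the key is absent rather than set to `false`, providers that do not recognize the parameter receive a request identical to the pre-ed35b09 behavior.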
Important

Fixes the issue by conditionally including the `parallel_tool_calls` parameter only when explicitly enabled in the OpenAI handlers.

- Includes the `parallel_tool_calls` parameter in `openai.ts` and `base-openai-compatible-provider.ts` only when `metadata?.parallelToolCalls === true`.
- Prevents loops with models that do not support `parallel_tool_calls` (e.g., GLM4.5).
- Updates `openai.spec.ts` to verify conditional inclusion of `parallel_tool_calls` (tests in `openai.spec.ts`, 13 tests in `base-openai-compatible-provider.spec.ts`).

This description was created by for 58d5f9d.