Commit 60e5cbc
feat: Add MaxTokens option for AI model output control
- Add --max-tokens flag to control maximum token output
- Support max_completion_tokens for OpenAI GPT-5 models
- Update all AI providers (Anthropic, OpenAI, Gemini, Ollama, DryRun)
- Add MaxTokens configuration to example.yaml
- Update help documentation and translations
- Add changelog entry for feature [#1747](https://github.com/2b3pro/fabric/issues/1747)

1 parent e513172
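The dispatch described above (using `max_completion_tokens` for newer OpenAI models while other providers keep a plain max-tokens setting) can be sketched as follows. This is an illustrative helper, not fabric's actual implementation; the function name `maxTokensParam` and the `gpt-5` prefix check are assumptions for the example.

```go
package main

import (
	"fmt"
	"strings"
)

// maxTokensParam is a hypothetical helper illustrating the behavior the
// commit describes: GPT-5 models expect the "max_completion_tokens"
// request field, while older OpenAI chat models use "max_tokens".
func maxTokensParam(model string) string {
	if strings.HasPrefix(model, "gpt-5") {
		return "max_completion_tokens"
	}
	return "max_tokens"
}

func main() {
	fmt.Println(maxTokensParam("gpt-5"))  // newer models
	fmt.Println(maxTokensParam("gpt-4o")) // older models
}
```

On the CLI, the new flag would presumably be passed as `fabric --max-tokens 1024 ...`, with `0` (unset) leaving each provider's default in place; consult the updated help text and `example.yaml` for the exact option names.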
1 file changed: 1 addition, 1 deletion (line 262).