Commit ab5792a
feat: Add MaxTokens option for AI model output control
Introduce a new `MaxTokens` flag and configuration option to allow users to specify the maximum number of tokens to generate in AI model responses.
This option is integrated across:
- Anthropic: Uses `MaxTokens` for `MessageNewParams`.
- Gemini: Sets `MaxOutputTokens` in `GenerateContentConfig`.
- Ollama: Sets `num_predict` option in chat requests.
- Dryrun: Includes `MaxTokens` in the formatted output.
Update example configuration to include `maxTokens` with a descriptive comment.
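The per-provider mapping described above can be sketched as follows. This is an illustrative sketch only: the `ChatOptions` struct and `buildOllamaOptions` helper are hypothetical names, not the repository's actual code; the only detail taken from the commit is that Ollama exposes the token cap as the `num_predict` option, while Anthropic and Gemini take it as `MaxTokens` / `MaxOutputTokens` on their request structs.

```go
package main

import "fmt"

// ChatOptions holds generation settings shared across providers.
// A MaxTokens of 0 is treated here as "use the provider default"
// (an assumption for this sketch).
type ChatOptions struct {
	MaxTokens int
}

// buildOllamaOptions maps the generic MaxTokens setting onto Ollama's
// request options map, where the cap is named "num_predict".
func buildOllamaOptions(o ChatOptions) map[string]any {
	opts := map[string]any{}
	if o.MaxTokens > 0 {
		opts["num_predict"] = o.MaxTokens
	}
	return opts
}

func main() {
	fmt.Println(buildOllamaOptions(ChatOptions{MaxTokens: 512}))
}
```

Each provider plugin would apply the same pattern: read the shared option once, then translate it into that backend's native field name when building the request.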
Resolved test issues.

1 parent 70f8c01 · commit ab5792a
File tree
9 files changed (+689 −7 lines)

- cmd/generate_changelog/incoming
- internal
  - cli
  - core
  - plugins/ai
    - anthropic
    - dryrun
    - gemini
    - ollama