6 changes: 5 additions & 1 deletion gptel-curl.el
@@ -281,6 +281,9 @@ Optional RAW disables text properties and transformation.
 (set-marker-insertion-type tracking-marker t)
 (plist-put info :tracking-marker tracking-marker))
 (goto-char tracking-marker)
+(when (plist-get info :last-was-tool-result)
+  (insert gptel-response-separator)
@karthink (Owner) commented on May 12, 2025:
Here's another edge case.

When you test with Claude, it usually responds with text ("I'll run this tool for you"), then a tool call, then more text ("The results of the tool are X"). This fix works fine for those cases.

Suppose the LLM responds instead with a tool call followed by a response, as Gemini and OpenAI models do. (No preamble before the tool call.)

Then you get a response that looks like this (with response prefix "Response: "):

User: What is the time in Greece?

Response:

The time in Greece is ...

instead of

User: What is the time in Greece?

Response: The time in Greece is...

The extra gptel-response-separator between "Response: " and the result comes from your :last-was-tool-result tracking.

+  (plist-put info :last-was-tool-result nil))
(unless raw
(when transformer
(setq response (funcall transformer response)))
@@ -295,7 +298,8 @@ Optional RAW disables text properties and transformation.
 (`(tool-call . ,tool-calls)
  (gptel--display-tool-calls tool-calls info))
 (`(tool-result . ,tool-results)
-  (gptel--display-tool-results tool-results info))))
+  (gptel--display-tool-results tool-results info)
+  (plist-put info :last-was-tool-result t))))

(defun gptel-curl--stream-filter (process output)
(let* ((fsm (car (alist-get process gptel--request-alist)))
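One way to address the edge case described above would be to make the separator conditional on text having already been streamed into the buffer. This is a hypothetical sketch, not part of this PR: `gptel-response-separator` and the `info` plist are from the existing code, but `:stream-has-text` is an invented flag that would have to be set elsewhere, the first time plain text (rather than a tool call) is inserted for this request.

```elisp
;; Hypothetical sketch: only insert the separator after a tool result
;; if some response text already precedes it, so a response that leads
;; with a tool call (Gemini/OpenAI style) does not get a stray
;; separator right after the response prefix.
(when (and (plist-get info :last-was-tool-result)
           (plist-get info :stream-has-text)) ; invented flag, set on first text insertion
  (insert gptel-response-separator))
(plist-put info :last-was-tool-result nil)
```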
2 changes: 1 addition & 1 deletion gptel-openai.el
@@ -297,7 +297,7 @@ Mutate state INFO with response metadata.
 :messages [,@prompts]
 :stream ,(or gptel-stream :json-false)))
 (reasoning-model-p ; TODO: Embed this capability in the model's properties
-  (memq gptel-model '(o1 o1-preview o1-mini o3-mini o3 o4-mini))))
+  (memq gptel-model '(o1 o1-preview o1-mini o3-mini o3 o4-mini gpt-5 gpt-5-mini gpt-5-nano))))
(when (and gptel-temperature (not reasoning-model-p))
(plist-put prompts-plist :temperature gptel-temperature))
(when gptel-use-tools