
Conversation

@iskng (Contributor) commented Dec 8, 2025

Summary

  • Fixes panic when using `.with_hook()` on streaming prompts without prior chat history
  • Fixes incorrect hook semantics where the wrong message was passed as the prompt

Problem

The streaming implementation's `on_completion_call` hook incorrectly tried to get the prompt from `chat_history.last()` instead of using the `current_prompt` variable. This caused:

  1. Panic when using `.with_hook()` without prior chat history (empty vec)
  2. Wrong semantics - the hook received the last history message (which could be an assistant message) instead of the actual user prompt being sent

Root Cause

The streaming architecture keeps `current_prompt` separate from `chat_history` until after the completion (line 267). But the hook code was trying to extract the prompt from `chat_history.last()`, which does not contain the current prompt at that point.
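
For illustration, here is a minimal, self-contained sketch of the failure mode. It models the streaming state with plain stand-in types rather than rig's actual message and hook types:

```rust
// Minimal sketch (not rig's real types): the streaming path holds the
// current prompt separately from the accumulated chat history.
#[derive(Clone, Debug)]
struct Message(String);

fn main() {
    // First turn of a conversation: no prior history yet.
    let chat_history: Vec<Message> = Vec::new();
    let current_prompt = Message("Hello".to_string());

    // Buggy pattern: assume the prompt is the last history entry.
    // On an empty history this `expect` panics.
    // let prompt = chat_history.last().cloned().expect("no prompt in history");

    // Fixed pattern: use the prompt that is already held separately and
    // pass the history (prior turns only) through unchanged.
    on_completion_call(&current_prompt, &chat_history);
}

// Stand-in for the hook callback; the real trait method takes more parameters.
fn on_completion_call(prompt: &Message, history: &[Message]) {
    println!("prompt = {:?}, prior history = {} message(s)", prompt, history.len());
}
```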

Fix

Use `current_prompt` directly in the hook call, and pass the full `chat_history` (which correctly represents the prior conversation only):

// Before (buggy): assumes chat_history is non-empty and that its last
// element is the current prompt
let prompt = reader.last().cloned().expect("..."); // panics on empty history
let chat_history_except_last = reader[..reader.len() - 1].to_vec();
hook.on_completion_call(&prompt, &chat_history_except_last, ...)

// After (fixed): the current prompt is passed explicitly, and the full
// prior history is forwarded unchanged
hook.on_completion_call(&current_prompt, &reader.to_vec(), ...)
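
As a sanity check, a test along these lines exercises the empty-history case that previously panicked. It targets the simplified model from the sketch above, not rig's real hook trait:

```rust
#[cfg(test)]
mod tests {
    // Hypothetical test against the simplified model above; rig's actual
    // hook trait and message types differ.
    #[derive(Clone, Debug, PartialEq)]
    struct Message(String);

    fn call_hook(prompt: &Message, history: &[Message]) -> (Message, usize) {
        // The fixed semantics: forward the prompt as-is and the prior
        // history unchanged.
        (prompt.clone(), history.len())
    }

    #[test]
    fn hook_receives_prompt_even_with_empty_history() {
        let history: Vec<Message> = Vec::new();
        let prompt = Message("Hello".into());
        let (seen_prompt, prior_len) = call_hook(&prompt, &history);
        assert_eq!(seen_prompt, prompt);
        assert_eq!(prior_len, 0); // no prior turns, and no panic
    }
}
```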

Test plan

  • cargo check -p rig-core passes
  • cargo test -p rig-core --lib passes (172 tests)
  • Manual test with streaming hook and empty history (no longer panics)

Fixes #1131

The streaming implementation's `on_completion_call` hook was incorrectly
trying to get the prompt from `chat_history.last()` instead of using the
`current_prompt` variable.

This caused two bugs:
1. Panic when using `.with_hook()` without prior chat history (empty vec)
2. Wrong semantics - hook received last history message instead of actual prompt

The fix uses `current_prompt` directly, which aligns with the streaming
architecture where the prompt is kept separate from chat_history until
after the completion.

Fixes 0xPlaygrounds#1131
@iskng force-pushed the fix/streaming-hook-panic branch from 0f6a77b to 7f29c19 on December 8, 2025 21:15
@joshua-mo-143 (Collaborator) commented:

Hi, thanks for the PR! I'll be testing this over today/tomorrow or so, whenever I get the chance.

@joshua-mo-143 (Collaborator) left a review comment:

lgtm, tested locally

@joshua-mo-143 added this pull request to the merge queue Dec 10, 2025
Merged via the queue into 0xPlaygrounds:main with commit 5bb85ca Dec 10, 2025
5 checks passed
@github-actions bot mentioned this pull request Dec 10, 2025
