Allows users to send multiple messages while the AI is streaming, without
interrupting the active response.
Demo:
https://github.com/user-attachments/assets/accde963-2076-4fd7-9941-34c90631b270

Closes #522.
A follow-up PR will add 'soft-interruptions', which take effect at the
end of the next content block (text-delta, reasoning-end, tool-result,
tool-error).
## Behavior
**While streaming:**
- Additional messages get queued instead of starting a new stream
- The queued message displays below the streaming content with muted styling
- The user can edit the queued message before the stream completes by pressing
the up arrow (as with regular messages) or clicking the "Edit" button
**On stream completion:**
- Queued messages auto-send immediately
- Queue supports text, images, and combined text-and-image messages
**On stream abort (Ctrl+C):**
- The queued message is restored to the chat input for editing
- Images are preserved
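
A minimal sketch of the lifecycle described above (the names `PendingMessage`, `onUserSend`, `onStreamEnd`, and `onStreamAbort` are illustrative assumptions, not the actual cmux API):

```typescript
// Illustrative only: queue while streaming, auto-send on completion,
// restore to the chat input on abort. Names are hypothetical.
type PendingMessage = { text: string; images: string[] };

let queued: PendingMessage | null = null;

function onUserSend(
  msg: PendingMessage,
  streaming: boolean,
  send: (m: PendingMessage) => void
): void {
  if (streaming) {
    queued = msg; // queue instead of starting a new stream
  } else {
    send(msg);
  }
}

function onStreamEnd(send: (m: PendingMessage) => void): void {
  if (queued) {
    send(queued); // auto-send immediately once the stream completes
    queued = null;
  }
}

function onStreamAbort(restoreToInput: (m: PendingMessage) => void): void {
  if (queued) {
    restoreToInput(queued); // Ctrl+C: text and images go back to the chat input
    queued = null;
  }
}
```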
## Known Issues
- **Compaction during streaming**: Messages cannot be queued while
compaction is running. This will be fixed in a follow-up PR.
## Implementation
- **MessageQueue service** - Accumulates text, images, and options, with
special handling for slash commands (see the sketch after this list)
- **AgentSession integration** - Stream-end triggers auto-send,
stream-abort triggers restore
- **QueuedMessage component** - Displays queued content with edit
functionality
- **IPC handlers** - Queue management and message routing
- **Test coverage** - 27 unit tests + 10 integration tests (requires
`ANTHROPIC_API_KEY`)
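
A minimal sketch of what the MessageQueue accumulation could look like; method names, the `SendOptions` shape, and the slash-command rule are assumptions for illustration, not the real implementation:

```typescript
// Hypothetical MessageQueue-style service; names and behavior are assumed.
export interface SendOptions {
  model?: string; // assumed example of a per-send option
}

export class MessageQueue {
  private text = "";
  private images: string[] = [];
  private options: SendOptions = {};

  /** Append queued text; a slash command replaces prior text (assumption). */
  addText(text: string): void {
    if (text.startsWith("/")) {
      this.text = text;
    } else {
      this.text = this.text ? `${this.text}\n${text}` : text;
    }
  }

  addImages(images: string[]): void {
    this.images.push(...images);
  }

  mergeOptions(options: SendOptions): void {
    this.options = { ...this.options, ...options };
  }

  isEmpty(): boolean {
    return this.text === "" && this.images.length === 0;
  }

  /** Drain the queue so the caller can auto-send it or restore it to the input. */
  take(): { text: string; images: string[]; options: SendOptions } {
    const payload = { text: this.text, images: this.images, options: this.options };
    this.text = "";
    this.images = [];
    this.options = {};
    return payload;
  }
}
```

Under this sketch, the stream-end handler would call `take()` and forward the payload to the normal send path, while the abort handler would push the same payload back into the chat input.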
## Testing
```bash
# Unit tests (no API keys required)
bun test src/services/messageQueue.test.ts
# Integration tests (requires ANTHROPIC_API_KEY)
TEST_INTEGRATION=1 bun x jest tests/ipcMain/queuedMessages.test.ts
```
_Generated with `cmux`_