Support Vercel AI Data Stream Protocol #2923
Conversation
This would be a wonderful add-on; looking forward to seeing it in main soon.
@samuelcolvin, can you add docs showing how to integrate this as a chat service in existing full-stack apps that use Pydantic AI as the agent stack?
I guess we rename the directory to just vercel_ai to keep things simple.
Thanks a lot for this! I left some comments on the code and I will try to test this and get back.
pydantic_ai_slim/pydantic_ai/vercel_ai_elements/request_types.py (resolved)
pydantic_ai_slim/pydantic_ai/vercel_ai_elements/response_stream.py (resolved)
pydantic_ai_slim/pydantic_ai/vercel_ai_elements/response_stream.py (resolved)
if not isinstance(event, AgentRunResultEvent):
    async for chunk in event_streamer.event_to_chunks(event):
        yield chunk.sse()
async for chunk in event_streamer.finish():
You are not sending any end chunk like TextEndChunk or ReasoningEndChunk. I'm not sure how important that is. I had them in my implementation: https://gist.github.com/183amir/ce45cf52f034b493fac0bb4b1838236a#file-vercel_ai-py-L497
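The end chunks could be emitted by tracking which part is still open when the stream finishes. A minimal sketch of that idea — TextEndChunk follows the protocol's naming, but the streamer shape here is illustrative, not the PR's implementation:

```python
# Illustrative sketch: track the currently open text part and emit an
# explicit end chunk when the stream finishes.
from dataclasses import dataclass


@dataclass
class TextEndChunk:
    """Stands in for the protocol's text-end chunk."""

    id: str


class EventStreamer:
    def __init__(self) -> None:
        self._open_text_id: str | None = None

    def start_text(self, part_id: str) -> None:
        # Called when a text part starts streaming.
        self._open_text_id = part_id

    def finish(self):
        # Close any still-open text part before the stream ends.
        if self._open_text_id is not None:
            yield TextEndChunk(id=self._open_text_id)
            self._open_text_id = None
```

The same bookkeeping would apply to reasoning parts with a ReasoningEndChunk.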
class DataUIPart(CamelBaseModel):
    """Data part with dynamic type based on data name."""

    type: str  # Will be f"data-{NAME}"
Maybe we can have type: str = pydantic.Field(pattern=r"^data-.*$")
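A sketch of that suggestion, assuming Pydantic v2 (where `Field(pattern=...)` replaced v1's `regex`); the model here is simplified, not the PR's full DataUIPart:

```python
# Sketch: constrain the type field to the "data-{NAME}" convention via a
# regex pattern, so invalid values fail at validation time.
from pydantic import BaseModel, Field, ValidationError


class DataUIPart(BaseModel):
    """Data part whose type must follow the f"data-{NAME}" convention."""

    type: str = Field(pattern=r'^data-')


DataUIPart(type='data-weather')  # accepted

try:
    DataUIPart(type='weather')  # rejected: missing the "data-" prefix
except ValidationError:
    pass
```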
"""Convert to Python from
https://github.com/vercel/ai/blob/ai%405.0.34/packages/ai/src/ui/ui-messages.ts
There's a newer version, I have to check the changes: https://github.com/vercel/ai/blob/ai%405.0.59/packages/ai/src/ui/ui-messages.ts
    yield _t.ToolOutputAvailableChunk(tool_call_id=tool_call_id, output=content)
case messages.RetryPromptPart(tool_name=tool_name, tool_call_id=tool_call_id, content=content):
    yield _t.ToolOutputAvailableChunk(tool_call_id=tool_call_id, output=content)
case messages.BuiltinToolCallEvent(part=part):
See if we need to handle these differently, so they are also interpreted as built-in tool call parts when sent back.
async def finish(self) -> AsyncIterator[_t.AbstractSSEChunk | DoneChunk]:
    """Send extra messages required to close off the stream."""
    if tool_call_id := self._final_result_tool_id:
        yield _t.ToolOutputAvailableChunk(tool_call_id=tool_call_id, output=None)
Need to do this in AG-UI as well: #3011
"""Convert to Python from
https://github.com/vercel/ai/blob/ai%405.0.34/packages/ai/src/ui/ui-messages.ts
Update to the latest version.
if isinstance(part, TextUIPart):
    prompt.append(part.text)
else:
    return JSONResponse({'errors': 'only text parts are supported yet'})
TODO: Support more of:
UIMessagePart = (
TextUIPart
| ReasoningUIPart
| ToolUIPart
| DynamicToolUIPart
| SourceUrlUIPart
| SourceDocumentUIPart
| FileUIPart
| DataUIPart
| StepStartUIPart
)
    provider_metadata: ProviderMetadata | None = None


class FileChunk(AbstractSSEChunk):
Use this for #2970
# Dynamic tool part states as separate models
class DynamicToolInputStreamingPart(CamelBaseModel):
To check if these are frontend tools
# Conflicts:
#	examples/pydantic_ai_examples/chat_app.py
#	pydantic_ai_slim/pydantic_ai/agent/abstract.py
#	pydantic_ai_slim/pydantic_ai/run.py
"""Handle a FinalResultEvent, tracking the final result tool."""
if event.tool_call_id and event.tool_name:
    self._final_result_tool_id = event.tool_call_id
    yield ToolInputStartChunk(tool_call_id=event.tool_call_id, tool_name=event.tool_name)
This doesn't look right.
from fastapi import Depends, Request, Response

from pydantic_ai import Agent, RunContext
from pydantic_ai.vercel_ai.starlette import StarletteChat
Fix this
yield event_str


# _ToolCallNotFoundError is defined here (not in ui/ag_ui) since it's specific to this module
Why though?
that transform Pydantic AI agent events into protocol-specific events (e.g., AG-UI, Vercel AI).
"""

# pyright: reportIncompatibleMethodOverride=false, reportUnknownVariableType=false, reportGeneralTypeIssues=false
Don't do this
from ..messages import (
    AgentStreamEvent,
    BuiltinToolCallEvent,  # type: ignore[reportDeprecated]
Don't handle these
class BaseEventStream(ABC, Generic[EventT, AgentDepsT]):
    """Base class for transforming pAI agent events into protocol-specific events.
Don't call it pAI
Returns:
    A new UUID-based message ID.
"""
self.message_id = str(uuid4())
Move to AG-UI
case ThinkingPart():
    async for e in self.handle_thinking_start(part):
        yield e
case ToolCallPart() | BuiltinToolCallPart():
Split up
self.message_id = str(uuid4())
return self.message_id


async def agent_event_to_events(self, event: AgentStreamEvent | AgentRunResultEvent) -> AsyncIterator[EventT]:  # noqa: C901
handle_event?
    async for e in self.handle_builtin_tool_return(part):
        yield e
case FilePart():
    # FilePart is not currently handled by UI protocols
Should be
async for e in self.handle_run_result(event):
    yield e


# Granular part handlers (abstract - must implement)
Allow not implementing all
# Error handling (must implement)

@abstractmethod
async def on_validation_error(self, error: Exception) -> AsyncIterator[EventT]:
Should arguably be on the adapter, not the event stream
"""

@abstractmethod
def encode_event(self, event: EventT) -> str:
Likely unnecessary. Also, the two adapters don't inherit from this at all. Dumb Claude.
@183amir Thanks for sharing! I've done quite a lot of (Claude-assisted) refactoring in this PR so those commits don't cleanly apply anymore, but I'll take them into consideration. Note also that this branch may very well be broken right now, I'll fix that today :)
This adds support for Vercel AI Elements streams to Pydantic AI.
There's an example frontend in a separate repo, github.com/pydantic/pydantic-ai-chat. The plan is to either make that into a more complete template, or release a pydantic-ai-chat Python library which contains a pre-built React frontend.

@DouweM we should rename this to use the proper terminology of the "Vercel AI Data Stream Protocol".
Here's a demo of basic usage:
pydantic-ai-chat.mp4