Fix/celery error #183
Conversation
Backend changes:
- Create LLM before graph compilation for proper streaming interception
- Change react config from COMPONENT (subgraph) to direct LLM+TOOL nodes
- Make build_graph async in all components to enable pre-creation of LLM
- Skip final AIMessage only when buffer has content (fix deep research)
- Move node transition detection before AIMessage skip check
- Access messages directly from state to preserve BaseMessage types

Frontend changes:
- Fix duplicate detection in streaming_chunk handler (condition was inverted)

Co-Authored-By: Claude <[email protected]>
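As a rough illustration of the first two backend points, an LLM instance can be created before graph compilation so a streaming callback is attached to it up front. The class, factory, and handler names below are assumptions for the sketch, not the project's actual code:

```python
import asyncio


class ExampleComponent:
    """Illustrative sketch only: names and structure are assumptions,
    not the project's actual classes."""

    def __init__(self, llm_factory, stream_handler):
        self.llm_factory = llm_factory
        self.stream_handler = stream_handler

    async def build_graph(self):
        # Create the LLM first so a streaming callback can be attached
        # to it before the graph is compiled around it.
        llm = self.llm_factory(callbacks=[self.stream_handler])
        return self._compile(llm)

    def _compile(self, llm):
        # Placeholder for real graph compilation
        return {"llm": llm}
```

Making `build_graph` async, as the PR does across components, lets callers await this pre-creation step uniformly.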
Reviewer's guide (collapsed on small PRs)

Updates the LLM graph node execution function to accept both dict-based and Pydantic BaseModel state objects by normalizing state access through a helper, ensuring consistent message extraction and prompt rendering.

Sequence diagram for updated llm_node state handling:

```mermaid
sequenceDiagram
    actor LangGraph
    participant llm_node
    participant GraphBuilder
    participant LLM
    LangGraph->>llm_node: call(state: StateDict | BaseModel)
    llm_node->>GraphBuilder: _state_to_dict(state)
    GraphBuilder-->>llm_node: state_dict
    llm_node->>llm_node: extract messages from state_dict[messages]
    llm_node->>GraphBuilder: _render_template(prompt_template, state_dict)
    GraphBuilder-->>llm_node: prompt
    llm_node->>llm_node: build llm_messages
    llm_node->>LLM: invoke(llm_messages)
    LLM-->>llm_node: response
    llm_node-->>LangGraph: updated StateDict
```
Class diagram for GraphBuilder and updated llm_node:

```mermaid
classDiagram
    class GraphBuilder {
        _build_llm_node(config: GraphNodeConfig) NodeFunction
        _state_to_dict(state: StateDict | BaseModel) dict
        _render_template(prompt_template: str, state: dict) str
    }
    class llm_node {
        <<function>>
        call(state: StateDict | BaseModel) StateDict
    }
    GraphBuilder o-- llm_node : creates
    llm_node ..> GraphBuilder : uses _state_to_dict
    llm_node ..> GraphBuilder : uses _render_template
```
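The node flow in the diagrams above can be sketched as follows. The factory function and node body are illustrative assumptions; only the `_state_to_dict` and `_render_template` call pattern comes from the review guide:

```python
def make_llm_node(builder, llm, prompt_template):
    """Build a node function following the flow in the diagrams above.

    Illustrative sketch, not the project's actual code; `builder` is
    assumed to expose _state_to_dict and _render_template.
    """

    def llm_node(state):  # state may be a StateDict or a BaseModel
        # Normalize state so dict and Pydantic inputs behave identically
        state_dict = builder._state_to_dict(state)
        messages = state_dict.get("messages", [])

        # Render the node's prompt from the normalized state
        prompt = builder._render_template(prompt_template, state_dict)

        # Prepend the rendered prompt, then invoke the model
        llm_messages = [{"role": "system", "content": prompt}, *messages]
        response = llm.invoke(llm_messages)

        # Return an updated StateDict with the response appended
        return {"messages": [*messages, response]}

    return llm_node
```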
File-Level Changes
Hey - I've left some high level feedback:
- Since `llm_node` now accepts both `StateDict` and `BaseModel`, consider renaming the parameter type alias or adding a brief comment near its definition to make this union type expectation clearer for future callers.
- Ensure `_state_to_dict` is side-effect free and cheap to call, as it is now invoked on every node execution; if not, consider caching or constraining the accepted input types more narrowly.

Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- Since `llm_node` now accepts both `StateDict` and `BaseModel`, consider renaming the parameter type alias or adding a brief comment near its definition to make this union type expectation clearer for future callers.
- Ensure `_state_to_dict` is side-effect free and cheap to call, as it is now invoked on every node execution; if not, consider caching or constraining the accepted input types more narrowly.
Codecov Report: ❌ Patch coverage is
Pull request overview
This PR fixes an issue where the llm_node function incorrectly assumed LangGraph would pass state as a dictionary, when in reality LangGraph passes state as a Pydantic BaseModel instance (created by build_state_class). The fix adds proper type handling to accept both dict and BaseModel inputs.
Changes:
- Updated `llm_node` function signature to accept `StateDict | BaseModel` instead of just `StateDict`
- Added `_state_to_dict` conversion to safely handle both dict and BaseModel state inputs
- Updated comments to accurately reflect that state can be either a dict or Pydantic BaseModel
```diff
  # Render prompt template
- prompt = self._render_template(llm_config.prompt_template, state)
+ prompt = self._render_template(llm_config.prompt_template, state_dict)
```
Copilot AI · Jan 19, 2026
Passing state_dict to _render_template is redundant because _render_template already calls self._state_to_dict(state) internally at line 255. This results in _state_to_dict being called twice on the already-converted dict. Instead, pass state directly to _render_template to avoid the redundant conversion: prompt = self._render_template(llm_config.prompt_template, state)
Suggested change:

```diff
- prompt = self._render_template(llm_config.prompt_template, state_dict)
+ prompt = self._render_template(llm_config.prompt_template, state)
```
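To illustrate why the suggestion avoids the double conversion — assuming `_render_template` normalizes its input internally, as the review comment states — a hedged sketch of the relevant call path (method bodies here are assumptions, only the names mirror the diff):

```python
class GraphBuilder:
    """Illustrative sketch showing only the relevant call path."""

    def _state_to_dict(self, state):
        # Accept either a plain dict or an object exposing its fields
        return state if isinstance(state, dict) else dict(state)

    def _render_template(self, prompt_template, state):
        # Normalizes internally, so callers may pass raw state and
        # avoid converting it twice.
        state_dict = self._state_to_dict(state)
        return prompt_template.format(**state_dict)
```

Since `_render_template` already calls `_state_to_dict`, passing raw `state` is sufficient; passing a pre-converted `state_dict` still works, just redundantly.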
What changed
Briefly describe the main changes in this PR.
Related Issue
Link any related issue (if applicable): #issue-number
Checklist
Items are checked by default; review them if any are not satisfied.
Additional Notes
Add any special notes or caveats here.
Summary by Sourcery
Handle LLM graph node state uniformly, supporting both `dict` and Pydantic model inputs when building prompts and messages.

Bug Fixes:
- Prevent runtime errors when LangGraph passes state as a Pydantic `BaseModel` rather than a plain `dict`.

Enhancements:
- Consistently extract messages and render templates from either `dict` or `BaseModel` inputs.
dict或BaseModel输入中一致地提取消息并渲染模板。Original summary in English
Summary by Sourcery
Handle LLM graph node state uniformly by supporting both dict and Pydantic model inputs when building prompts and messages.
Bug Fixes:
Enhancements: