Checked other resources
- I added a very descriptive title to this issue.
- I searched the LangGraph.js documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangGraph.js rather than my code.
- The bug is not resolved by updating to the latest stable version of LangGraph (or the specific integration package).
Example Code
The TypeScript code for the graph (router.ts) is as follows:
import { z } from "zod";
import { tool } from "@langchain/core/tools";
import { ChatAnthropic } from "@langchain/anthropic";
import { StateGraph, MessagesAnnotation, START, END } from "@langchain/langgraph";
import { ToolNode, toolsCondition } from "@langchain/langgraph/prebuilt";

// Tool
const multiplyTool = tool(
  async ({ a, b }: { a: number; b: number }) => {
    console.log(`Executing multiply tool: ${a}*${b}`);
    return a * b;
  },
  {
    name: "multiply",
    description: "Multiplies a and b.",
    schema: z.object({
      a: z.number().describe("first number"),
      b: z.number().describe("second number"),
    }),
  }
);

// LLM with bound tool
const llm = new ChatAnthropic({ model: "claude-3-5-haiku-latest", temperature: 0 });
const tools = [multiplyTool];
const llmWithTools = llm.bindTools(tools);

// Node
async function toolCallingLLM(state: typeof MessagesAnnotation.State) {
  const result = await llmWithTools.invoke(state.messages);
  return { messages: [result] };
}

enum NodeNames {
  TOOL_CALLING_LLM = "tool_calling_llm",
  TOOLS = "tools",
}

// Build graph
const builder = new StateGraph<typeof MessagesAnnotation.spec, any, any, NodeNames>(MessagesAnnotation);
builder.addNode(NodeNames.TOOL_CALLING_LLM, toolCallingLLM);
builder.addNode(NodeNames.TOOLS, new ToolNode(tools));
builder.addEdge(START, NodeNames.TOOL_CALLING_LLM);
builder.addConditionalEdges(NodeNames.TOOL_CALLING_LLM, toolsCondition);
builder.addEdge(NodeNames.TOOLS, END);

// Compile graph
export const graph = builder.compile();
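For context on why a malformed tool call still reaches the tools node: toolsCondition routes to it whenever the last message carries a non-empty tool_calls array. Below is a simplified, self-contained sketch of that routing decision (assumed behavior, not the actual @langchain/langgraph/prebuilt source):

```typescript
// Minimal stand-in for a message that may carry tool calls.
interface MessageLike {
  tool_calls?: { name: string; args: Record<string, unknown>; id?: string }[];
}

// Simplified sketch of the decision toolsCondition makes:
// route to "tools" when the last message requested a tool, else end.
function routeAfterLLM(messages: MessageLike[]): "tools" | "__end__" {
  const last = messages[messages.length - 1];
  return last?.tool_calls && last.tool_calls.length > 0 ? "tools" : "__end__";
}
```

Because the model response in both scenarios below ends up with a non-empty tool_calls array (one well-formed, one split), execution always enters the ToolNode, where the split version then fails schema validation.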
Error Message and Stack Trace (if applicable)
When I try to run the graph in LangGraph Studio, the tools node always fails:
{
  "messages": [
    {
      "status": "error",
      "content": "Error: Received tool input did not match expected schema\n Please fix your mistakes.",
      "name": "multiply",
      "tool_call_id": "toolu_01Mdkq5hFSaAUmAbwruGUJ34",
      "additional_kwargs": {},
      "response_metadata": {},
      "type": "tool"
    },
    {
      "status": "error",
      "content": "Error: Tool \"\" not found.\n Please fix your mistakes.",
      "name": "",
      "tool_call_id": "fallback-1",
      "additional_kwargs": {},
      "response_metadata": {},
      "type": "tool"
    }
  ]
}
Description
Hello, I use LangGraph to build a simple router graph with tool calling.

The TypeScript code for the graph (router.ts) is the same as in the Example Code section above.
Everything works fine when I simply invoke the graph:
const messages = await graph.invoke({
  messages: [new HumanMessage({ content: "What is 6 multiplied by 7?" })],
});
The tool_calling_llm node responds with an AIMessage in which the call to the multiply tool arrives as one complete tool call:
AIMessage {
  "id": "msg_01DkVozCzrgPJPRCfMTLEphQ",
  "content": [
    {
      "type": "text",
      "text": "I'll help you calculate that by using the multiply function."
    },
    {
      "type": "tool_use",
      "id": "toolu_0115uXLnXQ3iFXXky4HYRAFj",
      "name": "multiply",
      "input": {
        "a": 6,
        "b": 7
      }
    }
  ],
  ...
  "tool_calls": [
    {
      "name": "multiply",
      "args": {
        "a": 6,
        "b": 7
      },
      "id": "toolu_0115uXLnXQ3iFXXky4HYRAFj",
      "type": "tool_call"
    }
  ],
  ...
}
However, when I try to run the graph in LangGraph Studio, the tools node always fails with the error shown in the Error Message section above.
This happens because the tool_calling_llm node responds with an AIMessageChunk instead of an AIMessage.
The tool call is split into two entries, one carrying the tool name and the other carrying the input:
AIMessageChunk {
  "id": "msg_01JzyTs4eo3XW6owEBCSEhMe",
  "content": [
    {
      "index": 0,
      "type": "text",
      "text": "I'll help you calculate 1 times 2 using the multiply function."
    },
    {
      "index": 1,
      "type": "tool_use",
      "id": "toolu_01Mdkq5hFSaAUmAbwruGUJ34",
      "name": "multiply",
      "input": ""
    },
    {
      "index": 1,
      "input": "{\"a\": 1, \"b\": 2}",
      "type": "input_json_delta"
    }
  ],
  ...
  "tool_calls": [
    {
      "name": "multiply",
      "args": {},
      "id": "toolu_01Mdkq5hFSaAUmAbwruGUJ34",
      "type": "tool_call"
    },
    {
      "name": "",
      "args": {
        "a": 1,
        "b": 2
      },
      "id": "fallback-1",
      "type": "tool_call"
    }
  ],
  "tool_call_chunks": [
    {
      "id": "toolu_01Mdkq5hFSaAUmAbwruGUJ34",
      "index": 1,
      "name": "multiply",
      "args": ""
    },
    {
      "index": 1,
      "args": "{\"a\": 1, \"b\": 2}"
    }
  ],
  ...
}
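The two tool_call_chunks above share index 1, so they are meant to be merged into one tool call (name from the first chunk, arguments accumulated from the JSON deltas) before any tool executes; the error suggests this merge is not happening. Here is a minimal standalone sketch of that merge logic (a hypothetical helper with chunk shapes assumed from the dump above, not the library's actual implementation):

```typescript
// Hypothetical shape of a streamed tool-call chunk, based on the dump above.
interface ToolCallChunk {
  id?: string;
  index: number;
  name?: string;
  args: string; // partial JSON, accumulated across chunks
}

// Merge chunks that share an index: keep the first id/name seen and
// concatenate the partial JSON args, then parse the accumulated string.
function mergeToolCallChunks(chunks: ToolCallChunk[]) {
  const byIndex = new Map<number, { id?: string; name?: string; args: string }>();
  for (const chunk of chunks) {
    const existing = byIndex.get(chunk.index);
    if (existing) {
      existing.id = existing.id ?? chunk.id;
      existing.name = existing.name ?? chunk.name;
      existing.args += chunk.args;
    } else {
      byIndex.set(chunk.index, { id: chunk.id, name: chunk.name, args: chunk.args });
    }
  }
  return Array.from(byIndex.values()).map((c) => ({
    name: c.name ?? "",
    id: c.id,
    args: c.args ? JSON.parse(c.args) : {},
  }));
}

// The two chunks from the dump above merge into one complete multiply call.
const merged = mergeToolCallChunks([
  { id: "toolu_01Mdkq5hFSaAUmAbwruGUJ34", index: 1, name: "multiply", args: "" },
  { index: 1, args: '{"a": 1, "b": 2}' },
]);
```

When this merge is skipped, the first chunk becomes a multiply call with empty args (failing schema validation) and the second becomes a nameless call (the "Tool \"\" not found" error), which matches the two error messages above.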
I have tried disabling the messages stream mode next to the Submit button, but the issue persists.

I can reproduce the issue outside LangGraph Studio by invoking the graph with stream mode "messages":
const messages = await graph.invoke({
  messages: [new HumanMessage({ content: "What is 6 multiplied by 7?" })],
}, { streamMode: ["values", "messages"] });
It looks like disabling messages streaming in LangGraph Studio no longer works, which prevents me from debugging and testing my graph there.
I am running my graph with the LangGraph CLI (langgraphjs dev), so this appears to be an issue in LangGraph.js itself.
I would appreciate it if you could look into this issue.
System Info
"dependencies": {
  "@langchain/anthropic": "^0.3.0",
  "@langchain/core": "^0.3.76",
  "@langchain/langgraph": "^0.4.9",
  "dotenv": "^16.0.0",
  "zod": "^3.22.0"
}