Testing langgraph's streaming output behavior


When calling a langgraph graph in astream mode, if the graph contains subgraph nodes, the level at which output is captured varies depending on the settings used.

This post records my own test results, based on langgraph v1.1.2.

Setup

Imports:

from langchain_core.messages import AIMessage, HumanMessage
from langgraph.graph import StateGraph, START, END
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from typing import Any, TypedDict, Annotated
import operator
from dotenv import load_dotenv
from langgraph.graph.state import RunnableConfig

Define a simple test graph

  1. Define the state
class State(TypedDict):
    messages: Annotated[list, operator.add]
    log_info: dict[str, Any] # used to test the default state update strategy
  2. Create the LLM client
llm = ChatOpenAI(model=xx, base_url=xx, api_key=xx, streaming=True)
  3. Create an agent
def get_weather(query: str) -> str:
    """天气查询工具"""
    return f"Result for {query}: 晴"

worker_agent = create_agent(
    model=llm,
    tools=[get_weather],
    name="worker"
)
  4. Define some test nodes
async def wrapped_worker_node(state: State, config: RunnableConfig):
    # --- [pre-processing] ---
    print("[Pre-processing] Starting worker for user input...")
    # You could modify the state here, or log to an external system

    print(" [Streaming] Starting to stream worker output...")
    async for chunk in worker_agent.astream(state, config=config):
        print(f"Received chunk: {chunk}")

    # --- [post-processing] ---
    print("[Post-processing] Worker finished.")

    # Return the state update (this placeholder message is what shows up in
    # the parent graph's "updates" events in test 2)
    return {"messages": [{"role": "assistant", "content": "测试消息1"}]}
    
    
def node1(state: State):
    return {"log_info": {"node1": f"node1 执行日志"}}


def node2(state: State):
    print("node2 state:", state)
    return {
            "messages": [{"role": "assistant", "content": "测试消息2"}],
            "log_info": {"node2": f"node2 执行日志"}
        }


async def node3(state: State):
    print("node3 执行时,状态为: \n", state)
    # Custom streaming via the raw LLM client
    input_messages = [HumanMessage(content="你好,请调用工具查询广州气温")]
    full_output = ""
    async for chunk in llm.astream(input_messages):
        full_output += chunk.content # type: ignore
    print(f"Full output: {full_output}")
    return {"messages": [AIMessage(content=full_output)]}


def node4(state: State):
    print("node4 state:", state)
    return {"log_info": {"node4": f"node4 执行日志"}}

1. Calling the LLM client inside a node to stream the model's answer directly

Build the following graph, where node3 obtains its answer from the LLM client directly rather than from a CompiledGraph:

builder = StateGraph(State)
builder.add_node("node1", node1)
builder.set_entry_point("node1")
builder.add_node("node3", node3)
builder.add_edge("node1", "node3")
builder.add_node("node4", node4)
builder.add_edge("node3", "node4")
builder.add_edge("node4", END)

main_graph = builder.compile()

inputs: State = {"messages": [HumanMessage(content="你好,请调用工具查询天气")], "log_info": {}}
async for event in main_graph.astream(inputs, stream_mode=["updates", "messages"], version="v2"):
    print(f"Event: {event}")

The output:

Event: {'type': 'updates', 'ns': (), 'data': {'node1': {'log_info': {'node1': 'node1 执行日志'}}}}
node3 执行时,状态为: 
 {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={})], 'log_info': {'node1': 'node1 执行日志'}}
Event: {'type': 'messages', 'ns': (), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5913-8c8c-79f1-b4b4-dca6518ef8d4', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'langgraph_step': 2, 'langgraph_node': 'node3', 'langgraph_triggers': ('branch:to:node3',), 'langgraph_path': ('__pregel_pull', 'node3'), 'langgraph_checkpoint_ns': 'node3:39982d80-fcbc-e4b9-a6f2-454156958ef6', 'checkpoint_ns': 'node3:39982d80-fcbc-e4b9-a6f2-454156958ef6', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': (), 'data': (AIMessageChunk(content='我', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5913-8c8c-79f1-b4b4-dca6518ef8d4', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'langgraph_step': 2, 'langgraph_node': 'node3', 'langgraph_triggers': ('branch:to:node3',), 'langgraph_path': ('__pregel_pull', 'node3'), 'langgraph_checkpoint_ns': 'node3:39982d80-fcbc-e4b9-a6f2-454156958ef6', 'checkpoint_ns': 'node3:39982d80-fcbc-e4b9-a6f2-454156958ef6', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
(subsequent streaming chunks omitted...)
Full output: 我目前无法直接调用实时天气工具或外部API来查询气温。不过,你可以通过以下方式获取广州的实时气温:

1. 打开手机上的天气应用(如苹果天气、安卓天气通等);
2. 在浏览器中搜索“广州今天气温”;
3. 使用权威天气网站,如中国气象局官网(http://www.cma.gov.cn)或中央气象台;
4. 使用第三方平台如墨迹天气、彩云天气、百度天气或微信天气小程序。

通常,广州当前气温在20°C–35°C之间(视季节而定),建议你查看实时数据以获取准确信息,包括湿度、风力和空气质量等。

如果你告诉我具体日期或想了解未来几天的预报,我也可以帮你分析趋势或提供穿衣建议!
Event: {'type': 'messages', 'ns': (), 'data': (AIMessage(content='我目前无法直接调用实时天气工具或外部API来查询气温。不过,你可以通过以下方式获取广州的实时气温:\n\n1. 打开手机上的天气应用(如苹果天气、安卓天气通等);\n2. 在浏览器中搜索“广州今天气温”;\n3. 使用权威天气网站,如中国气象局官网(http://www.cma.gov.cn)或中央气象台;\n4. 使用第三方平台如墨迹天气、彩云天气、百度天气或微信天气小程序。\n\n通常,广州当前气温在20°C–35°C之间(视季节而定),建议你查看实时数据以获取准确信息,包括湿度、风力和空气质量等。\n\n如果你告诉我具体日期或想了解未来几天的预报,我也可以帮你分析趋势或提供穿衣建议!', additional_kwargs={}, response_metadata={}, id='132a3911-a871-480a-9b6f-af11f1262790', tool_calls=[], invalid_tool_calls=[]), {'langgraph_step': 2, 'langgraph_node': 'node3', 'langgraph_triggers': ('branch:to:node3',), 'langgraph_path': ('__pregel_pull', 'node3'), 'langgraph_checkpoint_ns': 'node3:39982d80-fcbc-e4b9-a6f2-454156958ef6'})}
Event: {'type': 'updates', 'ns': (), 'data': {'node3': {'messages': [AIMessage(content='我目前无法直接调用实时天气工具或外部API来查询气温。不过,你可以通过以下方式获取广州的实时气温:\n\n1. 打开手机上的天气应用(如苹果天气、安卓天气通等);\n2. 在浏览器中搜索“广州今天气温”;\n3. 使用权威天气网站,如中国气象局官网(http://www.cma.gov.cn)或中央气象台;\n4. 使用第三方平台如墨迹天气、彩云天气、百度天气或微信天气小程序。\n\n通常,广州当前气温在20°C–35°C之间(视季节而定),建议你查看实时数据以获取准确信息,包括湿度、风力和空气质量等。\n\n如果你告诉我具体日期或想了解未来几天的预报,我也可以帮你分析趋势或提供穿衣建议!', additional_kwargs={}, response_metadata={}, id='132a3911-a871-480a-9b6f-af11f1262790', tool_calls=[], invalid_tool_calls=[])]}}}
node4 state: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}), AIMessage(content='我目前无法直接调用实时天气工具或外部API来查询气温。不过,你可以通过以下方式获取广州的实时气温:\n\n1. 打开手机上的天气应用(如苹果天气、安卓天气通等);\n2. 在浏览器中搜索“广州今天气温”;\n3. 使用权威天气网站,如中国气象局官网(http://www.cma.gov.cn)或中央气象台;\n4. 使用第三方平台如墨迹天气、彩云天气、百度天气或微信天气小程序。\n\n通常,广州当前气温在20°C–35°C之间(视季节而定),建议你查看实时数据以获取准确信息,包括湿度、风力和空气质量等。\n\n如果你告诉我具体日期或想了解未来几天的预报,我也可以帮你分析趋势或提供穿衣建议!', additional_kwargs={}, response_metadata={}, id='132a3911-a871-480a-9b6f-af11f1262790', tool_calls=[], invalid_tool_calls=[])], 'log_info': {'node1': 'node1 执行日志'}}
Event: {'type': 'updates', 'ns': (), 'data': {'node4': {'log_info': {'node4': 'node4 执行日志'}}}}

  • Look at the events with type "messages": these are the LLM token outputs captured by langgraph. Token-level streaming inside a single node is captured as expected
  • There is one "messages" event that contains the complete message, captured around the node's return (this seems to happen only when streaming via the LLM client directly; it does not appear when using a CompiledGraph, as the results of test 3 confirm)
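For a consumer of these events, the token chunks can be reassembled per node by filtering on the metadata's `langgraph_node`. A minimal sketch, using plain dicts in place of the real `AIMessageChunk` payloads (the event shape is assumed from the logs above; `collect_node_tokens` is a hypothetical helper, not a langgraph API):

```python
def collect_node_tokens(events, node):
    """Concatenate streamed chunk contents emitted by one graph node."""
    parts = []
    for event in events:
        if event.get("type") != "messages":
            continue
        # "messages" events carry a (chunk, metadata) pair
        chunk, metadata = event["data"]
        if metadata.get("langgraph_node") == node:
            parts.append(chunk.get("content", ""))
    return "".join(parts)

# Simulated events mimicking the node3 stream from test 1
events = [
    {"type": "updates", "ns": (), "data": {"node1": {"log_info": {}}}},
    {"type": "messages", "ns": (), "data": ({"content": "我"}, {"langgraph_node": "node3"})},
    {"type": "messages", "ns": (), "data": ({"content": "目前"}, {"langgraph_node": "node3"})},
]
print(collect_node_tokens(events, "node3"))  # -> 我目前
```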

2. A node whose wrapper function contains a subgraph call

inputs: State = {"messages": [HumanMessage(content="你好,请调用工具查询天气")], "log_info": {}}
builder1 = StateGraph(State)
builder1.add_node("wrapped_worker", wrapped_worker_node)
builder1.add_node("node1", node1)
builder1.add_node("node2", node2)
builder1.add_node("node4", node4)


builder1.add_edge(START, "wrapped_worker")
builder1.add_edge("wrapped_worker", "node1")
builder1.add_edge("node1", "node2")
builder1.add_edge("node2", "node4")
builder1.add_edge("node4", END)
graph1 = builder1.compile()

config: RunnableConfig = {"configurable": {"thread_id": "1"}}  # not defined above; a minimal stand-in is assumed here
async for event in graph1.astream(inputs, config, stream_mode=["updates", "messages"], version="v2"):
    print(f"Event: {event}")

The output:

[Pre-processing] Starting worker for user input... 
[Streaming] Starting to stream worker output... 
Received chunk: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1')]} 
Received chunk: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1'), AIMessage(content='', additional_kwargs={}, response_metadata={'finish_reason': 'tool_calls', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5901-c0ed-7b00-98b2-4f7247de658a', tool_calls=[{'name': 'get_weather', 'args': {'query': '北京'}, 'id': 'call_a8a2417e6f1b458a84f9b0', 'type': 'tool_call'}], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}})]} 
Received chunk: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1'), AIMessage(content='', additional_kwargs={}, response_metadata={'finish_reason': 'tool_calls', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5901-c0ed-7b00-98b2-4f7247de658a', tool_calls=[{'name': 'get_weather', 'args': {'query': '北京'}, 'id': 'call_a8a2417e6f1b458a84f9b0', 'type': 'tool_call'}], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}), ToolMessage(content='Result for 北京: 晴', name='get_weather', id='9d7f4a03-d906-4d7a-bf11-39c06111bd03', tool_call_id='call_a8a2417e6f1b458a84f9b0')]} 
Received chunk: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1'), AIMessage(content='', additional_kwargs={}, response_metadata={'finish_reason': 'tool_calls', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5901-c0ed-7b00-98b2-4f7247de658a', tool_calls=[{'name': 'get_weather', 'args': {'query': '北京'}, 'id': 'call_a8a2417e6f1b458a84f9b0', 'type': 'tool_call'}], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}), ToolMessage(content='Result for 北京: 晴', name='get_weather', id='9d7f4a03-d906-4d7a-bf11-39c06111bd03', tool_call_id='call_a8a2417e6f1b458a84f9b0'), AIMessage(content='北京的天气是晴天。', additional_kwargs={}, response_metadata={'finish_reason': 'stop', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5901-c44d-7d62-9ecb-31caa673c32d', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}})]}
[Post-processing] Worker finished. 
Event: {'type': 'updates', 'ns': (), 'data': {'wrapped_worker': {'messages': [{'role': 'assistant', 'content': '测试消息1'}]}}} 
Event: {'type': 'updates', 'ns': (), 'data': {'node1': {'log_info': {'node1': 'node1 执行日志'}}}} 
node2 state: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1'), {'role': 'assistant', 'content': '测试消息1'}], 'log_info': {'node1': 'node1 执行日志'}} 
Event: {'type': 'updates', 'ns': (), 'data': {'node2': {'messages': [{'role': 'assistant', 'content': '测试消息2'}], 'log_info': {'node2': 'node2 执行日志'}}}} 
node4 state: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='9054f0d8-b9f1-4946-9022-e18b4e837ff1'), {'role': 'assistant', 'content': '测试消息1'}, {'role': 'assistant', 'content': '测试消息2'}], 'log_info': {'node2': 'node2 执行日志'}} 
Event: {'type': 'updates', 'ns': (), 'data': {'node4': {'log_info': {'node4': 'node4 执行日志'}}}}
  • The subgraph's astream call produced four "chunks", but they are not the model's token-level streaming output; each one is the full message list after one of four steps: question → analysis + tool call → tool result → final answer
  • The subgraph's "messages"-type astream events are not captured by the parent graph; subgraphs=True must be set for them to surface
  • Looking at log_info: when no reducer is specified, the default update strategy is to overwrite
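The overwrite-vs-merge behavior can be reproduced without langgraph at all. In the sketch below, `apply_update` is a hypothetical helper that simplifies langgraph's channel-update rule: `messages` is merged by its `operator.add` reducer, while `log_info`, which has no reducer, is simply replaced:

```python
import operator

# Keys with a reducer merge old and new values; keys without one are replaced.
REDUCERS = {"messages": operator.add}  # log_info has no reducer

def apply_update(state, update):
    """Apply one node's partial state update (simplified sketch)."""
    new_state = dict(state)
    for key, value in update.items():
        reducer = REDUCERS.get(key)
        new_state[key] = reducer(state[key], value) if reducer else value
    return new_state

state = {"messages": ["hi"], "log_info": {"node1": "node1 执行日志"}}
state = apply_update(state, {"messages": ["reply"], "log_info": {"node2": "node2 执行日志"}})
print(state["messages"])  # ['hi', 'reply'] -- concatenated by operator.add
print(state["log_info"])  # node1's entry is gone: the dict was overwritten
```

This mirrors what the node2/node4 logs show: `log_info` only ever holds the most recent node's entry.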

3. Using ainvoke/invoke inside the node

async def wrapped_worker_node(state: State, config: RunnableConfig):
    # --- [pre-processing] ---
    print("[Pre-processing] Starting worker for user input...")
    # You could modify the state here, or log to an external system

    print(" [Streaming] Starting to non-stream worker output...")
    result = await worker_agent.ainvoke(state, config=config)
    print("异步非流式调用结果:", result)
    content = result["messages"][-1]
    print(f"回复内容: {content}")

    # --- [post-processing] ---
    print("[Post-processing] Worker finished.")
    return {"messages": [content]}

Build the graph:

inputs: State = {"messages": [HumanMessage(content="你好,请调用工具查询天气")], "log_info": {}}
builder1 = StateGraph(State)
builder1.add_node("wrapped_worker", wrapped_worker_node)
builder1.add_node("node1", node1)
builder1.add_node("node2", node2)


builder1.add_edge(START, "wrapped_worker")
builder1.add_edge("wrapped_worker", "node1")
builder1.add_edge("node1", "node2")
builder1.add_edge("node2", END)
graph1 = builder1.compile()


config: RunnableConfig = {"configurable": {"thread_id": "1"}}  # not defined above; a minimal stand-in is assumed here
async for event in graph1.astream(inputs, config, stream_mode=["messages"], subgraphs=True, version="v2"):
    print(f"Event: {event}")

Test results:

[Pre-processing] Starting worker for user input...
 [Streaming] Starting to non-stream worker output...
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[{'name': 'get_weather', 'args': {}, 'id': 'call_1bfd6aa26a96419fb39473', 'type': 'tool_call'}], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[{'name': 'get_weather', 'args': '{"', 'id': 'call_1bfd6aa26a96419fb39473', 'index': 0, 'type': 'tool_call_chunk'}]), {'thread_id': '1', 'langgraph_step': 1, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:be587879-f68d-e1c0-2949-a521b03e8998', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[], invalid_tool_calls=[{'name': None, 'args': 'query": "北京', 'id': '', 'error': None, 'type': 'invalid_tool_call'}], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[{'name': None, 'args': 'query": "北京', 'id': '', 'index': 0, 'type': 'tool_call_chunk'}]), {'thread_id': '1', 'langgraph_step': 1, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:be587879-f68d-e1c0-2949-a521b03e8998', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[], invalid_tool_calls=[{'name': None, 'args': '"}', 'id': '', 'error': None, 'type': 'invalid_tool_call'}], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[{'name': None, 'args': '"}', 'id': '', 'index': 0, 'type': 'tool_call_chunk'}]), {'thread_id': '1', 'langgraph_step': 1, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:be587879-f68d-e1c0-2949-a521b03e8998', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'finish_reason': 'tool_calls', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 1, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:be587879-f68d-e1c0-2949-a521b03e8998', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={}, id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[], invalid_tool_calls=[], tool_call_chunks=[], chunk_position='last'), {'thread_id': '1', 'langgraph_step': 1, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:be587879-f68d-e1c0-2949-a521b03e8998', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (ToolMessage(content='Result for 北京: 晴', name='get_weather', id='1737ec2f-c212-40a1-883e-60b20a225c7b', tool_call_id='call_1bfd6aa26a96419fb39473'), {'thread_id': '1', 'langgraph_step': 2, 'langgraph_node': 'tools', 'langgraph_triggers': ('__pregel_push',), 'langgraph_path': ('__pregel_push', 0, False), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|tools:1d5f8254-c71c-f319-a192-88340feb3d7d', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98'})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='北京', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='的', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='天气是晴天', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='。', additional_kwargs={}, response_metadata={'model_provider': 'openai'}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={'finish_reason': 'stop', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}, tool_call_chunks=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
Event: {'type': 'messages', 'ns': ('wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98',), 'data': (AIMessageChunk(content='', additional_kwargs={}, response_metadata={}, id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], tool_call_chunks=[], chunk_position='last'), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'model', 'langgraph_triggers': ('branch:to:model',), 'langgraph_path': ('__pregel_pull', 'model'), 'langgraph_checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98|model:30222467-3eac-b9eb-a00f-c7cd96973d3e', 'checkpoint_ns': 'wrapped_worker:fba88753-0f22-422e-e99a-d5d6e2350b98', 'ls_provider': 'openai', 'ls_model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'ls_model_type': 'chat', 'ls_temperature': None})}
异步非流式调用结果: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='3a8d41ca-8aba-401b-b88f-fd9a22c1209d'), AIMessage(content='', additional_kwargs={}, response_metadata={'finish_reason': 'tool_calls', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5bd3-b724-7440-8f2b-3fe632423a69', tool_calls=[{'name': 'get_weather', 'args': {'query': '北京'}, 'id': 'call_1bfd6aa26a96419fb39473', 'type': 'tool_call'}], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}}), ToolMessage(content='Result for 北京: 晴', name='get_weather', id='1737ec2f-c212-40a1-883e-60b20a225c7b', tool_call_id='call_1bfd6aa26a96419fb39473'), AIMessage(content='北京的天气是晴天。', additional_kwargs={}, response_metadata={'finish_reason': 'stop', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}})]}
回复内容: 北京的天气是晴天。
[Post-processing] Worker finished.
node2 state: {'messages': [HumanMessage(content='你好,请调用工具查询天气', additional_kwargs={}, response_metadata={}, id='3a8d41ca-8aba-401b-b88f-fd9a22c1209d'), AIMessage(content='北京的天气是晴天。', additional_kwargs={}, response_metadata={'finish_reason': 'stop', 'model_name': 'Qwen/Qwen3-Next-80B-A3B-Instruct', 'model_provider': 'openai'}, name='worker', id='lc_run--019d5bd3-ba57-74c2-96c7-001554555f9c', tool_calls=[], invalid_tool_calls=[], usage_metadata={'input_tokens': 0, 'output_tokens': 0, 'total_tokens': 0, 'input_token_details': {}, 'output_token_details': {}})], 'log_info': {'node1': 'node1 执行日志'}}
Event: {'type': 'messages', 'ns': (), 'data': (AIMessage(content='测试消息2', additional_kwargs={}, response_metadata={}, id='ef979343-67e1-4081-a75f-3803d0b6954f', tool_calls=[], invalid_tool_calls=[]), {'thread_id': '1', 'langgraph_step': 3, 'langgraph_node': 'node2', 'langgraph_triggers': ('branch:to:node2',), 'langgraph_path': ('__pregel_pull', 'node2'), 'langgraph_checkpoint_ns': 'node2:f236c5a1-e972-bf00-0f4e-9300ead1eb69'})}
  • Even when the node calls ainvoke internally, as long as the parent graph is set to capture subgraphs, the MessageChunk stream from the model is still delivered to the parent as expected
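Since every event carries its namespace in `ns`, a consumer can separate subgraph output from top-level output by matching on the node-name part of each namespace entry. A sketch with plain dicts standing in for the real payloads (the `"node:uuid"` shape of `ns` entries is assumed from the logs above; `from_namespace` is a hypothetical helper):

```python
def from_namespace(events, prefix):
    """Keep events from a given subgraph node, or top-level events if prefix is None."""
    selected = []
    for event in events:
        ns = event.get("ns", ())
        if prefix is None:
            keep = ns == ()  # top-level events have an empty namespace
        else:
            # ns entries look like "wrapped_worker:<uuid>", so match on the node name
            keep = bool(ns) and ns[0].split(":", 1)[0] == prefix
        if keep:
            selected.append(event)
    return selected

events = [
    {"type": "messages", "ns": ("wrapped_worker:abc",), "data": "subgraph chunk"},
    {"type": "messages", "ns": (), "data": "top-level message"},
]
print(len(from_namespace(events, "wrapped_worker")))  # 1
print(from_namespace(events, None)[0]["data"])        # top-level message
```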