LangChain v0.2 Docs (7): How to Pass Arguments from One Step to the Next


When composing chains with several steps, sometimes you will want to pass data from previous steps unchanged for use as input to a later step. The RunnablePassthrough class allows you to do just this, and is typically used in conjunction with a RunnableParallel to pass data through to a later step in your constructed chains.


See the example below:

%pip install -qU langchain langchain-openai

import os
from getpass import getpass

os.environ["OPENAI_API_KEY"] = getpass()
from langchain_core.runnables import RunnableParallel, RunnablePassthrough

runnable = RunnableParallel(
    passed=RunnablePassthrough(),
    modified=lambda x: x["num"] + 1,
)

runnable.invoke({"num": 1})

API Reference: RunnableParallel | RunnablePassthrough

{'passed': {'num': 1}, 'modified': 2}

As seen above, the passed key was assigned RunnablePassthrough(), so it simply passed on {'num': 1} unchanged.

We also set a second key in the map, modified. This uses a lambda that adds 1 to the value of num, which resulted in a modified key with the value 2.
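Conceptually, each entry in the map receives the same input, and a passthrough simply returns that input unchanged. A minimal plain-Python sketch of this behavior (not LangChain's actual implementation, just an illustration of the semantics):

```python
# Plain-Python sketch of RunnableParallel + RunnablePassthrough semantics.
# This is an illustration, not LangChain's implementation.

def passthrough(x):
    # Mimics RunnablePassthrough: return the input unchanged.
    return x

def parallel(steps):
    # Mimics RunnableParallel: run every step on the same input
    # and collect the results under the corresponding keys.
    def run(x):
        return {key: step(x) for key, step in steps.items()}
    return run

runnable = parallel({
    "passed": passthrough,
    "modified": lambda x: x["num"] + 1,
})

print(runnable({"num": 1}))
# {'passed': {'num': 1}, 'modified': 2}
```

This mirrors the output above: both entries see the same {'num': 1} input, and only the modified entry transforms it.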

Retrieval Example

In the example below, we see a more real-world use case where we use RunnablePassthrough along with RunnableParallel in a chain to properly format inputs to a prompt:


from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

vectorstore = FAISS.from_texts(
    ["harrison worked at kensho"], embedding=OpenAIEmbeddings()
)
retriever = vectorstore.as_retriever()
template = """Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
model = ChatOpenAI()

retrieval_chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

retrieval_chain.invoke("where did harrison work?")

API Reference: FAISS | StrOutputParser | ChatPromptTemplate | RunnablePassthrough | ChatOpenAI | OpenAIEmbeddings

'Harrison worked at Kensho.'

Here the input to the prompt is expected to be a map with keys "context" and "question". The user input is just the question, so we need to fetch the context using our retriever and pass through the user input under the "question" key. The RunnablePassthrough allows us to pass the user's question on to the prompt and model.

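The dict at the head of the chain fans the single string input out to both entries, just as in the earlier RunnableParallel example. A plain-Python sketch of that fan-out, using a hypothetical stand-in retriever rather than the FAISS retriever above:

```python
# Plain-Python sketch of the fan-out at the head of the retrieval chain.
# fake_retriever is a hypothetical stand-in, not a real vector store.

def fake_retriever(question):
    # Stand-in for the retriever: return documents relevant to the question.
    return ["harrison worked at kensho"]

def build_prompt_inputs(question):
    # Mimics {"context": retriever, "question": RunnablePassthrough()}:
    # the single string input is handed to both entries of the map.
    return {
        "context": fake_retriever(question),
        "question": question,  # passthrough: the raw user input, unchanged
    }

inputs = build_prompt_inputs("where did harrison work?")
print(inputs)
```

The resulting map has exactly the "context" and "question" keys that the prompt template expects.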