LangChain (Components) v0.2 documentation (16): How to compose prompts together

This guide assumes familiarity with prompt templates.

LangChain provides a user friendly interface for composing different parts of prompts together. You can do this with either string prompts or chat prompts. Constructing prompts this way allows for easy reuse of components.

String prompt composition

When working with string prompts, the templates are joined together. You can compose prompts directly or mix in plain strings (the first element in the sequence needs to be a prompt).

from langchain_core.prompts import PromptTemplate

prompt = (
    PromptTemplate.from_template("Tell me a joke about {topic}")
    + ", make it funny"
    + "\n\nand in {language}"
)

prompt

API Reference: PromptTemplate

PromptTemplate(input_variables=['language', 'topic'], template='Tell me a joke about {topic}, make it funny\n\nand in {language}')
prompt.format(topic="sports", language="spanish")

'Tell me a joke about sports, make it funny\n\nand in spanish'

Chat prompt composition

A chat prompt is made up of a list of messages. Similarly to the above example, we can concatenate chat prompt templates. Each new element is a new message in the final prompt.

First, let's initialize a ChatPromptTemplate with a SystemMessage.

from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

prompt = SystemMessage(content="You are a nice pirate")

API Reference: AIMessage | HumanMessage | SystemMessage

You can then easily create a pipeline combining it with other messages or message templates. Use a Message when there are no variables to be formatted, and use a MessageTemplate when there are variables to be formatted. You can also use just a string (note: this will automatically be inferred as a HumanMessagePromptTemplate).

new_prompt = (
    prompt + HumanMessage(content="hi") + AIMessage(content="what?") + "{input}"
)

Under the hood, this creates an instance of the ChatPromptTemplate class, so you can use it just as you did before!

new_prompt.format_messages(input="i said hi")

[SystemMessage(content='You are a nice pirate'), HumanMessage(content='hi'), AIMessage(content='what?'), HumanMessage(content='i said hi')]

Using PipelinePrompt

LangChain includes a class called PipelinePromptTemplate, which can be useful when you want to reuse parts of prompts. A PipelinePrompt consists of two main parts:

  • Final prompt: The final prompt that is returned
  • Pipeline prompts: A list of tuples, consisting of a string name and a prompt template. Each prompt template will be formatted and then passed to future prompt templates as a variable with the same name.
from langchain_core.prompts import PipelinePromptTemplate, PromptTemplate

full_template = """{introduction}

{example}

{start}"""
full_prompt = PromptTemplate.from_template(full_template)

introduction_template = """You are impersonating {person}."""
introduction_prompt = PromptTemplate.from_template(introduction_template)

example_template = """Here's an example of an interaction:

Q: {example_q}
A: {example_a}"""
example_prompt = PromptTemplate.from_template(example_template)

start_template = """Now, do this for real!

Q: {input}
A:"""
start_prompt = PromptTemplate.from_template(start_template)

input_prompts = [
    ("introduction", introduction_prompt),
    ("example", example_prompt),
    ("start", start_prompt),
]
pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt, pipeline_prompts=input_prompts
)

pipeline_prompt.input_variables

API Reference: PipelinePromptTemplate | PromptTemplate

['person', 'example_a', 'example_q', 'input']
print(
    pipeline_prompt.format(
        person="Elon Musk",
        example_q="What's your favorite car?",
        example_a="Tesla",
        input="What's your favorite social media site?",
    )
)

You are impersonating Elon Musk.

Here's an example of an interaction:

Q: What's your favorite car?
A: Tesla

Now, do this for real!

Q: What's your favorite social media site?
A: