Build your first agent quickly with LangChain


A hands-on guide to building your own first agent. No complex machine learning, no algorithm deep-dives, no long-winded theory: just practical content, going straight to implementing an agent. There are two common approaches:

  1. Use the Dify platform for a quick build. This still costs some time, because you have to learn how the platform itself works.
  2. Follow the classic www.langchain.com/ quickstart, where you can get started immediately without worrying about platform operations or model internals.

Build a real-world agent

Next, build a practical weather forecasting agent that demonstrates key production concepts:

  1. Detailed system prompts for better agent behavior
  2. Create tools that integrate with external data
  3. Model configuration for consistent responses
  4. Structured output for predictable results
  5. Conversational memory for chat-like interactions
  6. Create and run the agent to test the fully functional agent

These are the complete implementation steps from the official docs.

Let's go straight to the example, and you'll see that building an agent is simple enough that almost anyone can do it. Combining traditional APIs with large language models is a genuinely brilliant idea.

```python
from dataclasses import dataclass

from langchain.agents import create_agent
from langchain.chat_models import init_chat_model
from langchain.tools import tool, ToolRuntime
from langgraph.checkpoint.memory import InMemorySaver
from langchain.agents.structured_output import ToolStrategy
```

Define system prompt

```python
SYSTEM_PROMPT = """You are an expert weather forecaster, who speaks in puns.

You have access to two tools:

- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location

If a user asks you for the weather, make sure you know the location. If you can tell from the question that they mean wherever they are, use the get_user_location tool to find their location."""
```

Define context schema

```python
@dataclass
class Context:
    """Custom runtime context schema."""
    user_id: str
```

Define tools

```python
@tool
def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"
```

```python
@tool
def get_user_location(runtime: ToolRuntime[Context]) -> str:
    """Retrieve user information based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"
```

Configure model

```python
model = init_chat_model(
    "claude-sonnet-4-6",
    temperature=0
)
```

Define response format

```python
@dataclass
class ResponseFormat:
    """Response schema for the agent."""
    # A punny response (always required)
    punny_response: str
    # Any interesting information about the weather if available
    weather_conditions: str | None = None
```

Set up memory

```python
checkpointer = InMemorySaver()
```

Create agent

```python
agent = create_agent(
    model=model,
    system_prompt=SYSTEM_PROMPT,
    tools=[get_user_location, get_weather_for_location],
    context_schema=Context,
    response_format=ToolStrategy(ResponseFormat),
    checkpointer=checkpointer
)
```

Run agent

`thread_id` is a unique identifier for a given conversation.

```python
config = {"configurable": {"thread_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather outside?"}]},
    config=config,
    context=Context(user_id="1")
)

print(response['structured_response'])
```

```
ResponseFormat(
    punny_response="Florida is still having a 'sun-derful' day! The sunshine is playing 'ray-dio' hits all day long! I'd say it's the perfect weather for some 'solar-bration'! If you were hoping for rain, I'm afraid that idea is all 'washed up' - the forecast remains 'clear-ly' brilliant!",
    weather_conditions="It's always sunny in Florida!"
)
```

Note that we can continue the conversation using the same `thread_id`.

```python
response = agent.invoke(
    {"messages": [{"role": "user", "content": "thank you!"}]},
    config=config,
    context=Context(user_id="1")
)

print(response['structured_response'])
```

```
ResponseFormat(
    punny_response="You're 'thund-erfully' welcome! It's always a 'breeze' to help you stay 'current' with the weather. I'm just 'cloud'-ing around waiting to 'shower' you with more forecasts whenever you need them. Have a 'sun-sational' day in the Florida sunshine!",
    weather_conditions=None
)
```

Run it yourself and see: it works beautifully. The core of how this example calls traditional APIs is really the prompt. The model uses the prompts to find the matching tool, calls that tool, and finally returns the result.

So we still haven't left natural language behind: an agent is essentially an executor that understands human language and can reason about context. When you ask "What's the weather today?", the agent matches the question against the tool descriptions in `tools`, first calls the API that retrieves the user's location, and then calls the API that looks up the weather for that location.
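To make that two-step chain concrete, here is a toy sketch in plain Python with no LLM and no LangChain at all. The hard-coded call order below only illustrates the routing decision that the model makes for us in the real agent; the function bodies mirror the demo tools above:

```python
# Toy sketch: simulates, in plain Python, the tool chain the agent runs.
# In the real agent, the LLM (not hard-coded logic) decides this call order.

def get_user_location(user_id: str) -> str:
    """Same logic as the demo tool: map a user ID to a location."""
    return "Florida" if user_id == "1" else "SF"

def get_weather_for_location(city: str) -> str:
    """Same logic as the demo tool: return a canned forecast."""
    return f"It's always sunny in {city}!"

# "what is the weather outside?" names no location, so the
# location tool runs first, then the weather tool.
city = get_user_location("1")
answer = get_weather_for_location(city)
print(answer)  # It's always sunny in Florida!
```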

One more note on the model: you can use your own. `ChatOpenAI` lets you call your own model to implement the same functionality.
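As a minimal sketch of that swap, the model name, endpoint URL, and key below are all placeholders (not values from this article); substitute your own provider's details:

```python
from langchain_openai import ChatOpenAI

# All values below are placeholders -- fill in your own provider's details.
model = ChatOpenAI(
    model="gpt-4o-mini",                  # whatever model your provider serves
    base_url="https://your-endpoint/v1",  # an OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
    temperature=0,
)
```

The resulting `model` can be passed to `create_agent` in place of the `init_chat_model` result, with no other changes to the example.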