What is LangChain
LangChain is a platform for agent engineering. In the official site's own words: "LangChain is the platform for agent engineering."
LangChain currently maintains two open-source frameworks:
- LangChain: helps you build agents quickly, with any model provider of your choice.
- LangGraph: allows you to control every step of your custom agent with low-level orchestration, memory, and human-in-the-loop support.
How LangChain evolves
Large-model technology is moving very fast, so the LangChain open-source frameworks iterate constantly to keep up. Both LangChain and LangGraph have now reached v1.0.
LangChain tutorial
LangChain's GitHub repository: https://github.com/langchain-ai/langchain
- Install the core package
pip install -U langchain
- Install the model-provider integrations
pip install -U langchain-deepseek
pip install -U openai
Create your first agent
from langchain.agents import create_agent
import os

# Purchase an API key on the DeepSeek website and set DEEPSEEK_API_KEY in your environment
os.environ['DEEPSEEK_API_KEY'] = 'xxxxxxx'

# A tool the model can invoke via function calling to look up the weather
def get_weather(city: str) -> str:
    """Get weather data for a given city."""
    return f"It's always sunny in {city}"

agent = create_agent(
    model='deepseek-chat',
    tools=[get_weather],
    system_prompt='You are a helpful assistant',
)

result = agent.invoke(
    {
        "messages": [{"role": "user", "content": "what is the weather in sf"}]
    }
)
# the agent's final reply is the last message in the list, not the first
print(result['messages'][-1].content)
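The invoke call returns a dict whose "messages" list holds the whole conversation in order. The real entries are message objects with a .content attribute, but the shape can be sketched with plain dicts (illustrative only, not LangChain objects):

```python
# illustrative shape of an agent result: plain dicts stand in for message objects
result = {
    "messages": [
        {"role": "user", "content": "what is the weather in sf"},
        {"role": "assistant", "content": "It's always sunny in sf"},
    ]
}

# the first entry is the user's input; the agent's final answer is the last one
print(result["messages"][-1]["content"])
```

This is why indexing with [0] would print your own question back instead of the answer.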
Building a real-world agent
A practical weather-forecast agent takes a few core steps:
1. A detailed system prompt for better agent behavior
2. Tools that integrate with external data
3. Model configuration for consistent responses
4. Structured output for predictable results
5. Conversational memory for chat-like interactions
6. Creating and running the agent to get a fully functional assistant

Hands-on walkthrough:
- A detailed system prompt sharpens the agent's role and behavior
SYSTEM_PROMPT = """You are an expert weather forecaster, who speaks in puns.
You have access to two tools:
- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location
If a user asks you for the weather, make sure you know the location. If you can tell from the question that they mean wherever they are, use the get_user_location tool to find their location."""
- Create the tools
from dataclasses import dataclass
from typing import Any
from langchain.tools import tool, ToolRuntime

@tool
def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}"

@dataclass
class Context:
    """Custom runtime context schema."""
    user_id: str

@tool
def get_user_location(runtime: ToolRuntime[Context, Any]) -> str:
    """Retrieve user information based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"
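The runtime context is how per-user data reaches a tool without ever passing through the model. Stripped of the ToolRuntime wrapper, the lookup logic is just a branch on the injected context; a stdlib-only sketch of the same idea:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Per-request data the tool can read but the model never sees."""
    user_id: str

def get_user_location(context: Context) -> str:
    # same branch as the tool above, minus the ToolRuntime plumbing
    return "Florida" if context.user_id == "1" else "SF"

print(get_user_location(Context(user_id="1")))  # Florida
print(get_user_location(Context(user_id="2")))  # SF
```

In the real agent, LangChain fills in the runtime argument for you at call time; the model only sees the tool's name and docstring.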
- Configure the model
from langchain.chat_models import init_chat_model

model = init_chat_model(
    "deepseek-chat",
    temperature=0.5,
    timeout=10,
    max_tokens=1000
)
- Define the response format
@dataclass
class ResponseFormat:
    """Response schema for the agent."""
    # A punny response (always required)
    punny_response: str
    # Any interesting information about the weather if available
    weather_conditions: str | None = None
- Conversational memory
from langgraph.checkpoint.memory import InMemorySaver
checkpointer = InMemorySaver()
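What the checkpointer buys you: each thread_id gets its own saved history, so a follow-up message like "thank you" lands in the same conversation while other threads stay isolated. A stdlib sketch of that bookkeeping (the real InMemorySaver stores full graph checkpoints, not just message lists):

```python
from collections import defaultdict

# per-thread message history, keyed by thread_id
store: dict[str, list[dict]] = defaultdict(list)

def save(thread_id: str, message: dict) -> None:
    store[thread_id].append(message)

save("1", {"role": "user", "content": "what is the weather outside?"})
save("1", {"role": "assistant", "content": "Sunny with a chance of puns!"})
save("2", {"role": "user", "content": "hello"})

# thread "1" holds both turns; thread "2" is a separate conversation
print(len(store["1"]), len(store["2"]))
```

This is why the invoke calls below pass config={"configurable": {"thread_id": "1"}}: the same id routes both turns into one history.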
- Create and run the agent
from langchain.agents import create_agent

agent = create_agent(
    model=model,
    system_prompt=SYSTEM_PROMPT,
    tools=[get_weather_for_location, get_user_location],
    context_schema=Context,
    response_format=ResponseFormat,
    checkpointer=checkpointer
)

config = {"configurable": {"thread_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather outside?"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])

response = agent.invoke(
    {"messages": [{"role": "user", "content": "thank you"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])
Complete demo
import os
from dataclasses import dataclass
from typing import Any

from langchain.agents import create_agent
from langchain.tools import tool, ToolRuntime
from langchain.chat_models import init_chat_model
from langgraph.checkpoint.memory import InMemorySaver

os.environ['DEEPSEEK_API_KEY'] = 'xxxxxxxxxxxxx'

SYSTEM_PROMPT = """You are an expert weather forecaster, who speaks in puns.
You have access to two tools:
- get_weather_for_location: use this to get the weather for a specific location
- get_user_location: use this to get the user's location
If a user asks you for the weather, make sure you know the location. If you can tell from the question that they mean wherever they are, use the get_user_location tool to find their location."""

@tool
def get_weather_for_location(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}"

@dataclass
class Context:
    """Custom runtime context schema."""
    user_id: str

@tool
def get_user_location(runtime: ToolRuntime[Context, Any]) -> str:
    """Retrieve user information based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"

@dataclass
class ResponseFormat:
    """Response schema for the agent."""
    # A punny response (always required)
    punny_response: str
    # Any interesting information about the weather if available
    weather_conditions: str | None = None

model = init_chat_model(
    "deepseek-chat",
    temperature=0.5,
    timeout=10,
    max_tokens=1000
)

checkpointer = InMemorySaver()

agent = create_agent(
    model=model,
    system_prompt=SYSTEM_PROMPT,
    tools=[get_weather_for_location, get_user_location],
    context_schema=Context,
    response_format=ResponseFormat,
    checkpointer=checkpointer
)

config = {"configurable": {"thread_id": "1"}}

response = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather outside?"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])

response = agent.invoke(
    {"messages": [{"role": "user", "content": "thank you"}]},
    config=config,
    context=Context(user_id="1")
)
print(response['structured_response'])