Official MCP documentation: https://modelcontextprotocol.io/introduction
1. What is it?
Quoting the official documentation's definition of MCP:
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications: just as USB-C offers a standardized way to connect devices to various peripherals and accessories, MCP offers a standardized way to connect AI models to different data sources and tools. MCP is an open, general, consensus-driven protocol standard led by Anthropic (the maker of Claude).
In essence, MCP defines a standard for connecting models to tools and ships SDKs so that each tool exposes its service interface in a prescribed way.
Below is a request the client sends to the model. In practice it is an ordinary chat-completion request with one addition: the `tools` section describing the available MCP tools. The model can then answer with a tool call, which the client executes against the corresponding MCP server before feeding the result back to the model.
```json
{
  "request_id": "2025031017252285",
  "timestamp": "2025-03-10T17:25:22.085602",
  "method": "POST",
  "path": "v1/chat/completions",
  "ip": "127.0.0.1",
  "headers": {
    "Host": "127.0.0.1:9494",
    "User-Agent": "AsyncOpenAI/Python 1.65.5",
    "Content-Length": "1139",
    "Accept": "application/json",
    "Accept-Encoding": "gzip, deflate, zstd",
    "Authorization": "",
    "Content-Type": "application/json",
    "Http-Referer": "https://github.com/adhikasp/mcp-client-cli",
    "X-Stainless-Arch": "x64",
    "X-Stainless-Async": "async:asyncio",
    "X-Stainless-Lang": "python",
    "X-Stainless-Os": "MacOS",
    "X-Stainless-Package-Version": "1.65.5",
    "X-Stainless-Retry-Count": "0",
    "X-Stainless-Runtime": "CPython",
    "X-Stainless-Runtime-Version": "3.12.9",
    "X-Title": "mcp-client-cli"
  },
  "data": {
    "messages": [
      {
        "content": "You are an AI assistant helping a software engineer...",
        "role": "system"
      },
      {
        "content": "What is the capital city of North Sumatra?",
        "role": "user"
      }
    ],
    "model": "gpt-4o-2024-11-20",
    "stream": true,
    "temperature": 0.7,
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_alerts",
          "description": "Get weather alerts for a US state.\n\n    Args:\n        state: Two-letter US state code (e.g. CA, NY)\n    ",
          "parameters": {
            "properties": {
              "state": {"type": "string"}
            },
            "required": ["state"],
            "type": "object"
          }
        }
      },
      {
        "type": "function",
        "function": {
          "name": "get_forecast",
          "description": "Get weather forecast for a location.\n\n    Args:\n        latitude: Latitude of the location\n        longitude: Longitude of the location\n    ",
          "parameters": {
            "properties": {
              "latitude": {"type": "number"},
              "longitude": {"type": "number"}
            },
            "required": ["latitude", "longitude"],
            "type": "object"
          }
        }
      },
      {
        "type": "function",
        "function": {
          "name": "save_memory",
          "description": "Save the given memory for the current user. Do not save duplicate memories.",
          "parameters": {
            "properties": {
              "memories": {
                "items": {"type": "string"},
                "type": "array"
              }
            },
            "required": ["memories"],
            "type": "object"
          }
        }
      }
    ]
  }
}
```
1.1 Overall architecture

Simply put, an MCP server is a single tool, e.g. one that runs web searches or operates on local files. The roles in the architecture are:
- MCP Hosts: programs that want to access data through MCP, such as Claude Desktop, IDEs, or AI tools (my understanding: these are the applications; MCP was originally designed to give Claude access to external capabilities; see the configuration sketch after this list)
- MCP Clients: protocol clients that maintain 1:1 connections with servers
- MCP Servers: lightweight programs that each expose specific capabilities through the standardized Model Context Protocol
- Local data sources: files, databases, and services on your computer that MCP servers can securely access
- Remote services: external systems that MCP servers can reach over the internet (e.g. through APIs)
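To make these roles concrete, here is a minimal sketch of how a host declares the servers it should launch, assuming Claude Desktop as the host: its `claude_desktop_config.json` maps server names to launch commands. The `weather` entry and its script path are hypothetical; other hosts use similar but not identical configuration.

```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["weather_server.py"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    }
  }
}
```

For each entry, the host spawns the server process and opens one dedicated MCP client connection to it, which is the 1:1 client-server relationship described above.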
This diagram makes it more concrete:

(Image source: https://mp.weixin.qq.com/s?__biz=MjM5NDkxMTgyNw==&mid=2653082498&idx=1&sn=9b8c7ebeef5c6fd2fd7a1804d810af07&chksm=bc349c568467b272befabc3e385ae47300b8b57a5f85d3024e00da2ad0c40f25eaa40e6562ac#rd)
1.2 Core concepts in MCP
1.2.1 MCP server
The FastMCP server is the core interface to the MCP protocol; it handles connection management, protocol compliance, and message routing:
```python
# Add lifespan support for startup/shutdown with strong typing
from contextlib import asynccontextmanager
from dataclasses import dataclass
from typing import AsyncIterator

from mcp.server.fastmcp import Context, FastMCP

# Create a named server
mcp = FastMCP("My App")

# Specify dependencies for deployment and development
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])

@dataclass
class AppContext:
    db: Database  # Replace with your actual DB type

@asynccontextmanager
async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
    """Manage application lifecycle with type-safe context"""
    db = Database()  # Replace with your actual DB initialization
    try:
        # Initialize on startup
        await db.connect()
        yield AppContext(db=db)
    finally:
        # Cleanup on shutdown
        await db.disconnect()

# Pass lifespan to server
mcp = FastMCP("My App", lifespan=app_lifespan)

# Access type-safe lifespan context in tools
@mcp.tool()
def query_db(ctx: Context) -> str:
    """Tool that uses initialized resources"""
    db = ctx.request_context.lifespan_context["db"]
    return db.query()
```
1.2.2 Resources
Resources expose data to the LLM:
```python
@mcp.resource("config://app")
def get_config() -> str:
    """Static configuration data"""
    return "App configuration here"

@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: str) -> str:
    """Dynamic user data"""
    return f"Profile data for user {user_id}"
```
1.2.3 Tools
A tool performs a concrete action:
```python
import httpx

@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Calculate BMI given weight in kg and height in meters"""
    return weight_kg / (height_m ** 2)

@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Fetch current weather for a city"""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text
```
1.2.4 Prompts
Reusable prompt templates that the server exposes to clients:
```python
from mcp.server.fastmcp.prompts import base

@mcp.prompt()
def review_code(code: str) -> str:
    return f"Please review this code:\n\n{code}"

@mcp.prompt()
def debug_error(error: str) -> list[base.Message]:
    return [
        base.UserMessage("I'm seeing this error:"),
        base.UserMessage(error),
        base.AssistantMessage("I'll help debug that. What have you tried so far?"),
    ]
```
1.2.5 Images
FastMCP provides an `Image` class that handles image data automatically:
```python
from mcp.server.fastmcp import FastMCP, Image
from PIL import Image as PILImage

@mcp.tool()
def create_thumbnail(image_path: str) -> Image:
    """Create a thumbnail from an image"""
    img = PILImage.open(image_path)
    img.thumbnail((100, 100))
    return Image(data=img.tobytes(), format="png")
```
1.2.6 Context
The `Context` object gives tools access to MCP capabilities such as logging, progress reporting, and reading resources (including files):
```python
from mcp.server.fastmcp import FastMCP, Context

@mcp.tool()
async def long_task(files: list[str], ctx: Context) -> str:
    """Process multiple files with progress tracking"""
    for i, file in enumerate(files):
        ctx.info(f"Processing {file}")
        await ctx.report_progress(i, len(files))
        data, mime_type = await ctx.read_resource(f"file://{file}")
    return "Processing complete"
```
1.3 Transports
1.3.1 Standard input/output (stdio)
In the MCP setting, stdio gives the model a direct way to interact with the local system. It is well suited to running an AI model as part of a command-line tool or plugging it into existing command-line workflows: the model can join a data pipeline like a traditional Unix tool and serve as one component of a larger system.
The stdio transport communicates over the standard input and output streams, which is particularly useful for local integrations and command-line tools.
Use stdio when building command-line tools, implementing local integrations, or when a simple parent-child process channel is all you need:
```python
import asyncio

from mcp.server import Server
from mcp.server.stdio import stdio_server

app = Server("example-server")

async def main():
    async with stdio_server() as streams:
        await app.run(
            streams[0],
            streams[1],
            app.create_initialization_options(),
        )

asyncio.run(main())
```
1.3.2 Server-Sent Events (SSE)
The SSE transport streams messages from server to client over Server-Sent Events, while client-to-server messages travel over HTTP POST requests.
Use SSE when you need server-to-client streaming, for example in restricted networks where only HTTP is allowed:
```python
from mcp.server import Server
from mcp.server.sse import SseServerTransport
from starlette.applications import Starlette
from starlette.routing import Route

app = Server("example-server")
sse = SseServerTransport("/messages")

async def handle_sse(scope, receive, send):
    async with sse.connect_sse(scope, receive, send) as streams:
        await app.run(streams[0], streams[1], app.create_initialization_options())

async def handle_messages(scope, receive, send):
    await sse.handle_post_message(scope, receive, send)

starlette_app = Starlette(
    routes=[
        Route("/sse", endpoint=handle_sse),
        Route("/messages", endpoint=handle_messages, methods=["POST"]),
    ]
)
```
2. What can it do? What problems does it solve?
MCP introduces a standardized design pattern. Before it, anyone building an agent typically had to implement a pile of tools themselves, tightly coupled into their own code. With its server-client design, MCP decouples tools from the application.
As an open protocol, it lets everyone implement tools against the same specification, which solves the reinvent-the-wheel problem and fully embraces open source. Many people will publish tools built on MCP that the rest of us can use directly, and that changes how we work: previously, to build a feature we had to think about how to implement it in code; with MCP, we can first check the MCP server listings for an existing implementation. This can significantly reduce development cost (in the ideal case, anyway; in practice an open-source tool may fall short of our requirements, and we may still have to build our own).
It replaces fragmented, one-off agent integrations, making AI systems more reliable and effective. With a common standard in place, service providers can expose the AI capabilities of their own services through the protocol, letting developers build more powerful AI applications faster.
3. How do you use it?
Official SDKs are provided for multiple languages.

For reference, see the Python SDK and its demos: https://github.com/modelcontextprotocol/python-sdk
3.1 Install dependencies
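The examples below use the official Python SDK, which is published on PyPI as the `mcp` package. A typical installation looks like this (the `[cli]` extra, which adds the development CLI, is an assumption based on the python-sdk README and may vary by version):

```bash
# with uv
uv add "mcp[cli]"

# or with plain pip
pip install "mcp[cli]"
```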
3.2 MCP server example
A minimal example:
```python
# server.py
from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("Demo")

# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
```
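With the `[cli]` extra from section 3.1 installed, the SDK's development command (per the python-sdk README; treat the exact invocation as an assumption) runs this server under the MCP Inspector for interactive testing:

```bash
mcp dev server.py
```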
An example that exposes a database:
```python
from mcp.server.fastmcp import FastMCP
import sqlite3

mcp = FastMCP("SQLite Explorer")

@mcp.resource("schema://main")
def get_schema() -> str:
    """Provide the database schema as a resource"""
    conn = sqlite3.connect("database.db")
    schema = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return "\n".join(sql[0] for sql in schema if sql[0])

@mcp.tool()
def query_data(sql: str) -> str:
    """Execute SQL queries safely"""
    conn = sqlite3.connect("database.db")
    try:
        result = conn.execute(sql).fetchall()
        return "\n".join(str(row) for row in result)
    except Exception as e:
        return f"Error: {str(e)}"
```
3.3 MCP client example
The SDK provides a high-level client interface for connecting to MCP servers:
```python
from mcp import ClientSession, StdioServerParameters, types
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="python",            # Executable
    args=["example_server.py"],  # Optional command line arguments
    env=None,                    # Optional environment variables
)

# Optional: create a sampling callback
async def handle_sampling_message(
    message: types.CreateMessageRequestParams,
) -> types.CreateMessageResult:
    return types.CreateMessageResult(
        role="assistant",
        content=types.TextContent(
            type="text",
            text="Hello, world! from model",
        ),
        model="gpt-3.5-turbo",
        stopReason="endTurn",
    )

async def run():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write, sampling_callback=handle_sampling_message) as session:
            # Initialize the connection
            await session.initialize()

            # List available prompts
            prompts = await session.list_prompts()

            # Get a prompt
            prompt = await session.get_prompt("example-prompt", arguments={"arg1": "value"})

            # List available resources
            resources = await session.list_resources()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            content, mime_type = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})

if __name__ == "__main__":
    import asyncio
    asyncio.run(run())
```
4. What MCP servers are available?
The following site is recommended; it categorizes servers, which makes browsing much easier:
https://glama.ai/mcp/servers
4.1 The official list
https://github.com/modelcontextprotocol/servers

4.2 Community lists
https://mcp.so/
An academic-search MCP server (this one only supports arXiv search):
https://github.com/adityak74/mcp-scholarly
4.3 How to use these servers
```bash
# With uvx
uvx mcp-server-git

# With pip
pip install mcp-server-git
python -m mcp_server_git
```
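Launching a server this way only starts the process; a client still has to connect to it. As a minimal sketch (assuming the `mcp` Python SDK from section 3.1 and `uvx` are installed), your own code can spawn `mcp-server-git` over stdio and list its tools, reusing the client API from section 3.3:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn the server as a subprocess and talk to it over stdio
server_params = StdioServerParameters(command="uvx", args=["mcp-server-git"])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```

In a GUI host such as Claude Desktop, the equivalent wiring is done in its `mcpServers` configuration instead (see section 1.1).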
5. What MCP clients are available?
5.1 The official list
https://modelcontextprotocol.io/clients
5.2 A personal implementation
https://github.com/adhikasp/mcp-client-cli
6. What problems remain?
- Performance bottleneck: JSON-RPC is inefficient when transferring large files.
- Security risks: there is no unified encryption standard, so data leakage is a concern.
- Technical barrier: servers must be deployed manually, and configuration is complex.
- The number of tools cannot grow without bound, because each tool's description, function name, input parameters, and output parameters must all be sent to the model, consuming context (see the sketch after this list).
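A rough sketch of that last point, assuming the `tiktoken` tokenizer library and using the `get_forecast` schema from section 1: every tool schema rides along with every request, so the context cost scales roughly linearly with the number of registered tools.

```python
import json

import tiktoken  # assumed tokenizer library; counts are approximate

# One tool schema as it appears in the request's `tools` array (section 1)
tool_schema = {
    "type": "function",
    "function": {
        "name": "get_forecast",
        "description": "Get weather forecast for a location.",
        "parameters": {
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"},
            },
            "required": ["latitude", "longitude"],
            "type": "object",
        },
    },
}

enc = tiktoken.get_encoding("cl100k_base")
per_tool = len(enc.encode(json.dumps(tool_schema)))
# With N tools, roughly N * per_tool tokens are spent before the user says a word
print(f"~{per_tool} tokens per tool, ~{per_tool * 100} tokens for 100 such tools")
```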
References
Some additional curated resources:
https://github.com/punkpeye/awesome-mcp-servers/blob/main/README-zh.md#finance--fintech