Overview
I searched online for a long time and couldn't find an example of streaming DeepSeek output that also streams the model's thinking content, so today I'm sharing mine.
Overall architecture
1. Obtain a DeepSeek API key
2. Call the API with streaming output
Terminology
DeepSeek: an LLM provider with an OpenAI-compatible API. Its deepseek-reasoner model (DeepSeek-R1) runs in thinking mode and exposes its reasoning as a separate reasoning_content field.
Technical details
The full code is below:
import json
import os

from openai import OpenAI

from config.path_config import path_config


def deepseek_msg(query: str, ws=None):
    # Read the API key from the environment, falling back to the config file
    api_key = os.environ.get("DEEPSEEK_API_KEY", path_config.DEEPSEEK_API_KEY)
    client = OpenAI(api_key=api_key, base_url=path_config.DEEPSEEK_URL)
    messages = [{'role': 'user', 'content': query}]
    try:
        response = client.chat.completions.create(
            model="deepseek-reasoner",
            messages=messages,
            stream=True
        )
        # Accumulators for the final answer and the thinking
        content = ""
        reasoning_content = ""
        for chunk in response:
            if chunk.choices and chunk.choices[0].delta:
                delta = chunk.choices[0].delta
                # reasoning_content carries the model's thinking; it streams first
                if delta.reasoning_content:
                    reasoning_content += delta.reasoning_content
                    send_content(ws, {"thought": delta.reasoning_content})
                # content carries the final answer
                if delta.content:
                    content += delta.content
                    send_content(ws, {"content": delta.content})
        return content, reasoning_content
    except Exception as e:
        error_msg = f"Error in DeepSeek API call: {str(e)}"
        print(error_msg)
        if ws:
            ws.send(json.dumps({"error": error_msg}, ensure_ascii=False))
        return error_msg


def send_content(ws, content_dict):
    """Send a dict as JSON over the websocket; fall back to printing."""
    if ws:
        ws.send(json.dumps(content_dict, ensure_ascii=False))
    else:
        print(json.dumps(content_dict, ensure_ascii=False))


if __name__ == "__main__":
    userPrompt = """你好"""
    deepseek_msg(query=userPrompt)
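Since send_content only needs an object with a send method, the wire format can be checked without a real WebSocket connection. Below is a small sketch using a hypothetical FakeWS stub (my own scaffolding, not part of the original code):

```python
import json

def send_content(ws, content_dict):
    """Same helper as above: JSON-encode and send, or print if no socket."""
    if ws:
        ws.send(json.dumps(content_dict, ensure_ascii=False))
    else:
        print(json.dumps(content_dict, ensure_ascii=False))

class FakeWS:
    """Hypothetical stub that records frames instead of sending them."""
    def __init__(self):
        self.sent = []
    def send(self, msg):
        self.sent.append(msg)

ws = FakeWS()
send_content(ws, {"thought": "思考中"})
send_content(ws, {"content": "答案"})
# ensure_ascii=False keeps the Chinese text readable in the JSON frames
```
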
Output:
{"thought": "您好"}
{"thought": "!"}
{"thought": "很高兴"}
{"thought": "为您"}
{"thought": "提供服务"}
{"thought": "。"}
{"thought": "您"}
{"thought": "有什么"}
{"thought": "问题"}
{"thought": "或"}
{"thought": "需要"}
{"thought": "帮助"}
{"thought": "的地方"}
{"thought": "吗"}
{"thought": "?"}
{"thought": "无论是"}
{"thought": "学习"}
{"thought": "、"}
{"thought": "工作"}
{"thought": "还是"}
{"thought": "生活中的"}
{"thought": "疑问"}
{"thought": ","}
{"thought": "我"}
{"thought": "都很"}
{"thought": "乐意"}
{"thought": "协助"}
{"thought": "您"}
{"thought": "解决"}
{"thought": "。"}
{"thought": "请"}
{"thought": "随时"}
{"thought": "告诉我"}
{"thought": "您的"}
{"thought": "需求"}
{"thought": ","}
{"thought": "我会"}
{"thought": "尽力"}
{"thought": "提供"}
{"thought": "详细的"}
{"thought": "解答"}
{"thought": "和"}
{"thought": "指导"}
{"thought": "。"}
{"content": "您好"}
{"content": "!"}
{"content": "很高兴"}
{"content": "为您"}
{"content": "提供服务"}
{"content": "。"}
{"content": "您"}
{"content": "有什么"}
{"content": "问题"}
{"content": "或"}
{"content": "需要"}
{"content": "帮助"}
{"content": "的地方"}
{"content": "吗"}
{"content": "?"}
{"content": "无论是"}
{"content": "学习"}
{"content": "、"}
{"content": "工作"}
{"content": "还是"}
{"content": "生活中的"}
{"content": "疑问"}
{"content": ","}
{"content": "我"}
{"content": "都很"}
{"content": "乐意"}
{"content": "协助"}
{"content": "您"}
{"content": "解决"}
{"content": "。"}
{"content": "请"}
{"content": "随时"}
{"content": "告诉我"}
{"content": "您的"}
{"content": "需求"}
{"content": ","}
{"content": "我会"}
{"content": "尽力"}
{"content": "提供"}
{"content": "详细的"}
{"content": "解答"}
{"content": "和"}
{"content": "指导"}
{"content": "。"}
Process finished with exit code 0
Now for the code details. The key settings are stream=True and model="deepseek-reasoner": the model must be deepseek-reasoner, which is DeepSeek-R1 with thinking mode. From there it comes down to delta.reasoning_content and delta.content: reasoning_content is the model's thinking, and content is the final answer. It's really quite simple, yet for some reason I couldn't find an example of it online.
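The branching on the two delta fields can be exercised without calling the API at all. Here is a sketch that feeds hand-built delta objects (SimpleNamespace standing in for the SDK's delta type) through the same logic:

```python
from types import SimpleNamespace

def consume(delta, state):
    # reasoning_content streams first (the thinking), then content (the answer)
    if delta.reasoning_content:
        state["thought"] += delta.reasoning_content
    if delta.content:
        state["answer"] += delta.content

# Hand-built chunks mimicking the order deepseek-reasoner emits them in
chunks = [
    SimpleNamespace(reasoning_content="您好", content=None),
    SimpleNamespace(reasoning_content="!", content=None),
    SimpleNamespace(reasoning_content=None, content="您好!"),
]
state = {"thought": "", "answer": ""}
for delta in chunks:
    consume(delta, state)
```
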
Summary
In short: take any standard streaming-output example, switch the model to deepseek-reasoner, emit both delta.reasoning_content and delta.content, and add a WebSocket on top, and you have streaming output with the thinking content included.
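The whole loop can also be rehearsed end to end with a faked stream, which is handy for unit tests. This sketch (my own test scaffolding, not from the original post) replaces the API response with synthetic chunks and reuses the same accumulation logic:

```python
import json
from types import SimpleNamespace

def fake_stream():
    """Yield synthetic chunks shaped like the SDK's streaming response."""
    def chunk(reasoning, answer):
        delta = SimpleNamespace(reasoning_content=reasoning, content=answer)
        return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])
    yield chunk("思考", None)
    yield chunk("结束", None)
    yield chunk(None, "最终")
    yield chunk(None, "回答")

def collect(response, frames):
    """Same loop as deepseek_msg, but recording frames instead of a socket."""
    content, reasoning_content = "", ""
    for chunk in response:
        if chunk.choices and chunk.choices[0].delta:
            delta = chunk.choices[0].delta
            if delta.reasoning_content:
                reasoning_content += delta.reasoning_content
                frames.append(json.dumps({"thought": delta.reasoning_content}, ensure_ascii=False))
            if delta.content:
                content += delta.content
                frames.append(json.dumps({"content": delta.content}, ensure_ascii=False))
    return content, reasoning_content

frames = []
result = collect(fake_stream(), frames)
# All "thought" frames arrive before any "content" frame, matching the log above
```
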