Function Calling
Enables a model to fetch data and perform actions.
Function calling gives large language models a powerful and flexible way to interact with your code or with external services.
import json

from openai import OpenAI

# Function that will be executed outside the LLM
def get_weather(location: str, unit: str):
    print(f"\nGetting the weather for {location} in {unit}...")
    fake_weather_of_Dallas = "The weather is 98 degrees fahrenheit, with partly cloudy skies and a low chance of rain."
    fake_weather_of_SF = "Clouds giving way to sun Hi: 76° Tonight: Mainly clear early, then areas of low clouds forming Lo: 56°"
    if 'Francisco' in location:
        return fake_weather_of_SF
    elif 'Dallas' in location:
        return fake_weather_of_Dallas
    else:
        return "unknown city"

# Tool-call procedure
def tool_call_infer(client, model_name):
    # Prepare request-1 params
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City and state, e.g., 'San Francisco, CA'"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location", "unit"]
            }
        }
    }]
    tool_choice = {"type": "function", "function": {"name": "get_weather"}}
    city = "San Francisco"
    # city = "Dallas"
    chat_prompt = [{"role": "user", "content": f"What's the weather like in {city}?"}]
    # Request-1
    response = client.chat.completions.create(
        model=model_name,
        messages=chat_prompt,
        temperature=0,
        max_tokens=200,
        tools=tools,
        tool_choice=tool_choice
    )
    # Check response-1
    tool_call_ret = response.choices[0].message.tool_calls[0]
    print(f"Function called: {tool_call_ret.function.name}")
    print(f"Arguments: {tool_call_ret.function.arguments}")
    # Call the local function with the arguments from response-1
    weather_by_func = get_weather(**json.loads(tool_call_ret.function.arguments))
    # Prepare request-2 params
    # *a system_prompt is needed in some cases
    system_prompt = {
        "role": "system",
        "content": "You are a helpful assistant with tool calling capabilities. When you receive a tool call response, use the output to format an answer to the original user question."
    }
    # Append the function-call result to the request-2 params
    func_ret_in_prompt = {
        "role": "tool",
        "content": weather_by_func,
        "tool_call_id": tool_call_ret.id
    }
    chat_prompt.append(system_prompt)
    # The assistant message carrying the tool call must precede the tool result
    chat_prompt.append(response.choices[0].message)
    chat_prompt.append(func_ret_in_prompt)
    # Request-2
    response = client.chat.completions.create(
        messages=chat_prompt,
        temperature=0,
        max_tokens=200,
        model=model_name,
        # tools=tools,
        # tool_choice="none"
    )
    print("\nFinal output : \n", response.choices[0].message.content)

if __name__ == "__main__":
    model_name = "deepseek"
    client = OpenAI(base_url="http://localhost:8888/v1", api_key="dummy")
    tool_call_infer(client, model_name)
Start the inference service, then update model_name and the port number in the `__main__` block at the end of the script to match your current configuration.
Once the service is up, run the script with python to issue the request.
On success, you should see results like the following:
- With DeepSeek-R1-AWQ as the inference model:
Function called: get_weather
Arguments: {"location": "San Francisco", "unit": "celsius"}
Getting the weather for San Francisco in celsius...
Final output :
Okay, the user asked about the weather in San Francisco. Let me check the tool response. The current condition is "Clouds giving way to sun" with a high of 76°F. Tonight, it'll be mainly clear early, then areas of low clouds forming with a low of 56°F. I need to present this info clearly. Start with the current weather, mention the high, then the tonight forecast with the low. Keep it friendly and concise. Maybe add a note about the temperature change in the evening.
</think>
Right now in San Francisco, the clouds are clearing up for sunshine with a high of **76°F**. Tonight, skies will be mostly clear early, followed by some low clouds rolling in, and temperatures dropping to a low of **56°F**. A pleasant day with mild evening cooling! 🌤️
- With Meta-Llama-3.1-8B-Instruct as the inference model:
Function called: get_weather
Arguments: {"location": "San Francisco, CA", "unit": "fahrenheit"}
Getting the weather for San Francisco, CA in fahrenheit...
Final output :
The current weather in San Francisco is mostly sunny with a high temperature of 76°F (24°C). Tonight, it will be mainly clear in the early hours, but low clouds will start to form later. The low temperature tonight is expected to be around 56°F (13°C).