feat: enhance the tool-calling proxy to support multiple tool calls and message-history conversion

Key improvements:
- Add a convert_tool_calls_to_content function that converts tool_calls in the message history into an XML format the LLM can understand
- Fix response_parser to parse multiple tool_calls in a single response
- Improve response parsing so that content and tool_calls can coexist
- Add full test coverage, including multiple tool calls, message conversion, and mixed responses
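The tool-call-to-content conversion above could look roughly like the following sketch. The XML tag name (`tool_call`) and the exact message shape are assumptions for illustration, not the repository's actual wire format:

```python
def convert_tool_calls_to_content(messages: list[dict]) -> list[dict]:
    """Sketch: fold assistant `tool_calls` into an XML-ish content string
    the downstream LLM can read as plain text. Tag names are assumed."""
    converted = []
    for msg in messages:
        tool_calls = msg.get("tool_calls")
        if msg.get("role") == "assistant" and tool_calls:
            parts = [msg.get("content") or ""]
            for call in tool_calls:
                fn = call["function"]
                # arguments is already a JSON string in the OpenAI-style schema
                parts.append(
                    f'<tool_call name="{fn["name"]}">{fn["arguments"]}</tool_call>'
                )
            msg = {**msg, "content": "\n".join(p for p in parts if p)}
            msg.pop("tool_calls", None)
        converted.append(msg)
    return converted
```

Messages without tool_calls pass through unchanged, so the function is safe to call unconditionally on the whole history.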

Technical details:
- services.py: implement conversion of the tool-call history into content
- response_parser.py: use non-greedy matching to parse multiple tool_calls
- main.py: integrate the message conversion so the message history is passed to the LLM correctly
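The non-greedy matching mentioned for response_parser.py can be sketched as below. A greedy `.*` would swallow everything from the first opening tag to the last closing tag as one match; `.*?` stops at the nearest close, so each call is captured separately. The tag name mirrors the assumed format above and is not confirmed by the source:

```python
import re

# Non-greedy quantifiers so each <tool_call>...</tool_call> pair is
# matched on its own; DOTALL lets call bodies span multiple lines.
TOOL_CALL_RE = re.compile(r"<tool_call.*?>(.*?)</tool_call>", re.DOTALL)

def parse_tool_calls(content: str) -> tuple[str, list[str]]:
    """Return (plain content with tool-call tags stripped, list of call bodies)."""
    calls = TOOL_CALL_RE.findall(content)
    plain = TOOL_CALL_RE.sub("", content).strip()
    return plain, calls
```

Returning both the stripped text and the call bodies is what allows content and tool_calls to coexist in one response, as the commit message describes.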

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Commit 5c2904e010 (parent f7508d915b), 2025-12-31 13:33:25 +00:00
Contained in: Vertex-AI-Step-Builder
6 changed files with 624 additions and 31 deletions


@@ -8,7 +8,7 @@ from fastapi import FastAPI, HTTPException, Depends, Request
 from starlette.responses import StreamingResponse
 from .models import IncomingRequest, ProxyResponse
-from .services import process_chat_request, stream_llm_api, inject_tools_into_prompt, parse_llm_response_from_content, _parse_sse_data
+from .services import process_chat_request, stream_llm_api, inject_tools_into_prompt, parse_llm_response_from_content, _parse_sse_data, convert_tool_calls_to_content
 from .core.config import get_settings, Settings
 from .database import init_db, log_request, update_request_log
@@ -87,8 +87,13 @@ async def chat_completions(
         raise HTTPException(status_code=500, detail="LLM API Key or URL is not configured.")
     messages_to_llm = request_obj.messages
+    # Convert assistant messages with tool_calls to content format
+    messages_to_llm = convert_tool_calls_to_content(messages_to_llm)
+    logger.info(f"Converted tool calls to content format for log ID: {log_id}")
     if request_obj.tools:
-        messages_to_llm = inject_tools_into_prompt(request_obj.messages, request_obj.tools)
+        messages_to_llm = inject_tools_into_prompt(messages_to_llm, request_obj.tools)
     # Handle streaming request
     if request_obj.stream: