fix: Handle tool calls in JSON array format

Fixed an issue where LLM tool calls returned as JSON arrays were not detected
and appeared in the chat as raw JSON instead of being executed.

The parse_tool_call method now handles:
- Single tool call objects
- Arrays of tool calls (OpenAI standard format)

This prevents tool call JSON from appearing in the chat window and ensures
tools are executed properly.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Rodrigo Rodriguez 2026-02-16 00:19:03 +00:00
parent b92ef7c034
commit 4ca7e5da40


@@ -78,10 +78,18 @@ impl ToolExecutor {
    }

    /// Parse a tool call JSON from any LLM provider
    /// Handles OpenAI, GLM, Claude formats
    /// Handles both single objects and arrays of tool calls
    pub fn parse_tool_call(chunk: &str) -> Option<ParsedToolCall> {
        // Try to parse as JSON
        let json: Value = serde_json::from_str(chunk).ok()?;

        // Handle array of tool calls (common OpenAI format)
        if let Some(arr) = json.as_array() {
            if let Some(first_tool) = arr.first() {
                return Self::extract_tool_call(first_tool);
            }
        }

        // Check if this is a tool_call type (from GLM wrapper)
        if let Some(tool_type) = json.get("type").and_then(|t| t.as_str()) {
            if tool_type == "tool_call" {