fix(agent): resolve math agent error by fixing streaming response handling and server URL

pull/819/head
DP37 3 months ago committed by ascender1729
parent b4fc45e1a0
commit e34b4462c3

Math Agent System Initialized
2025-04-20 11:05:16 | WARNING | swarms.structs.agent:llm_handling:646 - Model name is not provided, using gpt-4o-mini. You can configure any model from litellm if desired.
Available operations:
Math Agent: add, multiply, divide
Enter your query (or 'exit' to quit): add 2 and 4
╭─────────────────────────────────────────────────────── Agent Name: Math Agent [Max Loops: 1] ───────────────────────────────────────────────────────╮
│ Math Agent: <litellm.litellm_core_utils.streaming_handler.CustomStreamWrapper object at 0x7fb9d6ddcdf0> │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
2025-04-20 11:05:28 | ERROR | swarms.structs.agent:mcp_execution_flow:2807 - MCP flow failed: Expecting value: line 1 column 1 (char 0)
╭─────────────────────────────────────────────── Agent Name: Math Agent - Tool Executor [Max Loops: 1] ───────────────────────────────────────────────╮
│ Math Agent - Tool Executor: [MCP-error] Expecting value: line 1 column 1 (char 0) │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭────────────────────────────────────────────── Agent Name: Math Agent - Agent Analysis [Max Loops: 1] ───────────────────────────────────────────────╮
│ Math Agent - Agent Analysis: <litellm.litellm_core_utils.streaming_handler.CustomStreamWrapper object at 0x7fb9d6e14d30> │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
Math Agent Response: System: : Your Name: Math Agent
Your Description: Specialized agent for mathematical computations
You are a specialized math agent that can perform calculations by calling external math service APIs.
Key responsibilities:
1. Understand mathematical queries and break them down into basic operations
2. Use available math tools (add, multiply, divide) appropriately
3. Provide clear explanations of calculations
4. Handle errors gracefully if operations fail
Remember to use the available MCP tools for calculations rather than doing them directly.
Human:: add 2 and 4
Math Agent: <litellm.litellm_core_utils.streaming_handler.CustomStreamWrapper object at 0x7fb9d6ddcdf0>
Tool Executor: [MCP-error] Expecting value: line 1 column 1 (char 0)
Math Agent: <litellm.litellm_core_utils.streaming_handler.CustomStreamWrapper object at 0x7fb9d6e14d30>
Enter your query (or 'exit' to quit):

### What happened?
1. **The LLM is returning a streaming wrapper instead of text**
   Because `streaming_on=True`, LiteLLM gives you a `CustomStreamWrapper` object.
   Your helper `parse_llm_output()` doesn't know how to turn that object into a string, so the
   agent stores the wrapper itself.
2. **`mcp_execution_flow()` received an empty string**
   When the agent tried to forward the tool call to your FastMCP server, it called
   `json.loads(<empty string>)`, which raised
   *“Expecting value: line 1 column 1 (char 0)”*.
   That means the HTTP call to `http://0.0.0.0:8000/mcp` returned **no body**,
   usually because the path is wrong or the server crashed.
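For the first point, one way to unwrap the stream is to collect the delta chunks yourself. A minimal sketch, assuming the chunks follow LiteLLM's OpenAI-style `choices[0].delta.content` layout (`parse_llm_output` here is a stand-in for your helper, which may look different):

```python
def parse_llm_output(output):
    """Collapse a streaming LLM response into plain text.

    Strings pass through unchanged; any iterable of chunk objects
    (e.g. LiteLLM's CustomStreamWrapper) is joined chunk by chunk;
    anything else falls back to str().
    """
    if isinstance(output, str):
        return output
    if hasattr(output, "__iter__"):  # CustomStreamWrapper is iterable
        pieces = []
        for chunk in output:
            delta = chunk.choices[0].delta
            if getattr(delta, "content", None):  # skip empty/None deltas
                pieces.append(delta.content)
        return "".join(pieces)
    return str(output)
```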
---
### FastMCP mock server: the correct URL
`FastMCP.run(transport="sse", port=8000)` serves SSE on
`http://<host>:8000/stream` **and** accepts JSON-RPC over POST on the root
`http://<host>:8000/`.
So point the client at the root (no `/mcp` suffix):
```python
math_server = MCPServerSseParams(
    url="http://0.0.0.0:8000",  # <── just the origin
    headers={"Content-Type": "application/json"},
    timeout=5.0,
    sse_read_timeout=30.0,
)
```
---
### Stop streaming while you debug
```python
math_agent = Agent(
    agent_name="Math Agent",
    agent_description="Specialized agent for mathematical computations",
    system_prompt=MATH_AGENT_PROMPT,
    max_loops=1,
    mcp_servers=[math_server],
    streaming_on=False,  # <── turn off for now
)
```
(You can turn it back on when everything works and you extend
`parse_llm_output()` to unwrap streams.)
---
### Make sure the LLM actually *produces* a tool call
LLMs won't invent a JSON-RPC envelope unless you ask them to.
Add a very small system hint that mirrors the MCP schema, e.g.
```python
TOOL_CALL_INSTRUCTION = """
When you want to use a math tool, reply with a JSON object only:
{"tool_name": "<add|multiply|divide>", "a": <int>, "b": <int>}
"""
math_agent.short_memory.add(role="system", content=TOOL_CALL_INSTRUCTION)
```
Now “**add 2 and 4**” should yield
```json
{"tool_name":"add","a":2,"b":4}
```
which your `mcp_execution_flow()` can forward.
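On the agent side, a defensive parse of that reply keeps malformed output from reaching the server. A sketch (`extract_tool_call` is a hypothetical helper name; it assumes the model replies with the JSON object only, as the system hint requests):

```python
import json

VALID_TOOLS = {"add", "multiply", "divide"}

def extract_tool_call(llm_reply: str):
    """Parse the model's JSON-only reply into a tool-call dict.

    Returns None if the reply is not valid JSON or names an
    unknown tool, so the caller can fall back gracefully.
    """
    try:
        call = json.loads(llm_reply.strip())
    except json.JSONDecodeError:
        return None
    if isinstance(call, dict) and call.get("tool_name") in VALID_TOOLS:
        return call
    return None
```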
---
### Fix the empty-body crash with a guard (optional)
Inside `mcp_execution_flow()` (or `mcp_flow`) you can protect yourself:
```python
resp = client.post(...)
if not resp.text:
    raise RuntimeError("MCP server returned empty body")
```
That turns an unhelpful JSON decode error into a clear MCP connection error.
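A slightly fuller version of that guard (a sketch; `safe_json` is a hypothetical helper, not part of Swarms) also catches non-JSON bodies:

```python
import json

def safe_json(resp_text: str, url: str) -> dict:
    """Decode an MCP response body, raising clear errors instead of
    letting json.loads produce 'Expecting value: line 1 column 1'."""
    if not resp_text:
        raise RuntimeError(f"MCP server at {url} returned an empty body")
    try:
        return json.loads(resp_text)
    except json.JSONDecodeError as exc:
        # Include a snippet of the body so the real problem is visible
        raise RuntimeError(
            f"MCP server at {url} returned non-JSON: {resp_text[:80]!r}"
        ) from exc
```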
---
### Quick test commands
```bash
# 1. Start mock server in one shell
python mock_math_server.py # “Starting Mock Math Server on port 8000…”
# 2. In another shell, make sure it answers
curl -X POST http://0.0.0.0:8000 \
-H "Content-Type: application/json" \
-d '{"tool_name":"add","a":2,"b":4}'
# → {"result":6}
# 3. Run your client
python mcp_client.py
```
If you now type `add 2 and 4`, the agent should respond **6** (with your
explanation string) instead of the stream-object dump.
---
#### Next steps
* Re-enable `streaming_on=True` and add handling of LiteLLM's stream wrapper in
  `parse_llm_output()`.
* Return the math result to the user in natural language (e.g. “2 + 4 = 6”).
* Add error handling in the mock server (e.g. divide-by-zero, already covered).
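For the last bullet, the mock server's divide tool can return an error payload rather than raise. A sketch, assuming the tool returns a plain dict (the actual signature in your `mock_math_server.py` may differ):

```python
def divide(a: float, b: float) -> dict:
    """Mock divide tool: returns {"result": ...} on success or an
    {"error": ...} payload instead of raising ZeroDivisionError."""
    if b == 0:
        return {"error": "division by zero"}
    return {"result": a / b}
```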
That should give you a clean smoke test of **FastMCP + Swarms Agent** without
the function-calling “craziness”.