The log tells us two separate things are still wrong:

| Symptom | Root cause | One-line fix |
|---------|------------|--------------|
| `RuntimeWarning: coroutine 'batch_mcp_flow' was never awaited` | `batch_mcp_flow()` is **async**, but you call it like a plain function. | `out = await batch_mcp_flow(...)` *or* wrap the call in `asyncio.run(…)` if you want to stay sync. |
| `[MCP-error] Expecting value: line 1 column 1 (char 0)` | The FastMCP mock server replied, but your client tried to `json.loads()` an **empty body** (usually means the HTTP endpoint or headers are wrong). | Point the URL at the server **root** (`http://0.0.0.0:8000`) and send plain JSON-RPC. |
Below is a minimal patch that removes both errors without touching the rest of your codebase.
---
## 1   make `mcp_execution_flow()` await the coroutine
```python
# ── swarms/structs/agent.py ─────────────────────────────────────────
import asyncio  # add this at the top if it's not there
# …

def mcp_execution_flow(self, payload: dict):
    """
    Forward the tool-call dict to every MCP server in self.mcp_servers.
    Returns the first non-error response as a string.
    """
    try:
        # batch_mcp_flow is async ➜ run it synchronously
        result = asyncio.run(
            batch_mcp_flow(self.mcp_servers, payload)
        )
        return any_to_str(result)
    except Exception as err:
        logger.error(f"MCP flow failed: {err}")
        return f"[MCP-error] {err}"
```
*(If you prefer your whole agent to stay async, just make
`mcp_execution_flow` itself `async def` and `await batch_mcp_flow` —
then call it with `await` from the `_run` loop.)*
---
## 2   use the correct FastMCP endpoint
In the client that spins up the **math agent**:
```python
math_server = MCPServerSseParams(
    url="http://0.0.0.0:8000",  # ← root! no “/mcp”
    headers={"Content-Type": "application/json"},
    timeout=5.0,
    sse_read_timeout=30.0,
)
```
FastMCP's `run(transport="sse", port=8000)` already exposes both
the SSE stream and the JSON-RPC POST endpoint on that root URL.
Adding `/mcp` makes the request hit a 404, so the body is empty — that's
exactly what the JSON decoder complained about.
---
## 3   (optional) turn streaming off until everything works
```python
math_agent = Agent(
    # …
    streaming_on=False,  # ← easier to debug; turn back on later
)
```
With streaming disabled, `LiteLLM` returns plain strings, so your
`parse_llm_output()` method won't be handed a
`CustomStreamWrapper` object any more.
---
### Quick test matrix
| Action | Expected result |
|--------|-----------------|
| `curl -X POST http://0.0.0.0:8000 -d '{"tool_name":"add","a":2,"b":3}' -H 'Content-Type: application/json'` | `{"result":5}` |
| Run `mock_math_server.py` | “Starting Mock Math Server on port 8000…” |
| Run `mcp_client.py`, type `add 2 and 3` | Agent replies something like “2 + 3 = 5”. No coroutine warning. |
As soon as the round-trip works once, you can:

* re-enable `streaming_on=True` and teach `parse_llm_output()` to turn a
  `CustomStreamWrapper` into text (`"".join(token.choices[0].delta.content for token in wrapper)`);
* point the agent at your real MCP servers instead of the mock one.
Happy hacking!