The log tells us two separate things are still wrong:

| Symptom | Root cause | 1‑line fix |
|---------|------------|------------|
| `RuntimeWarning: coroutine 'batch_mcp_flow' was never awaited` | `batch_mcp_flow()` is **async**, but you call it like a plain function. | `out = await batch_mcp_flow(...)` *or* wrap the call in `asyncio.run(…)` if you want to stay sync. |
| `[MCP‑error] Expecting value: line 1 column 1 (char 0)` | The Fast‑MCP mock server replied, but your client tried to `json.loads()` an **empty body** (usually means the HTTP endpoint or headers are wrong). | Point the URL at the server **root** (`http://0.0.0.0:8000`) and send plain JSON‑RPC. |

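
The first row's warning is easy to reproduce and fix in isolation. A minimal stdlib-only sketch, where the `batch` coroutine is a hypothetical stand-in for `batch_mcp_flow`:

```python
import asyncio

async def batch(servers, payload):
    # hypothetical stand-in for the async batch_mcp_flow helper
    return {"result": payload["a"] + payload["b"]}

# Calling batch(...) bare would only create a coroutine object and trigger
# "RuntimeWarning: coroutine 'batch' was never awaited".
# asyncio.run() actually drives the coroutine to completion from sync code:
out = asyncio.run(batch(["http://0.0.0.0:8000"], {"a": 2, "b": 3}))
print(out)  # {'result': 5}
```
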
Below is a minimal patch that removes both errors without touching the rest of your code‑base.

---

## 1️⃣ — make `mcp_execution_flow()` await the coroutine

```python
# ── swarms/structs/agent.py ─────────────────────────────────────────
import asyncio  # add this at the top if it’s not there

# …

def mcp_execution_flow(self, payload: dict):
    """
    Forward the tool‑call dict to every MCP server in self.mcp_servers.
    Returns the first non‑error response as a string.
    """
    try:
        # batch_mcp_flow is async ➜ run it synchronously
        result = asyncio.run(
            batch_mcp_flow(self.mcp_servers, payload)
        )
        return any_to_str(result)
    except Exception as err:
        logger.error(f"MCP flow failed: {err}")
        return f"[MCP-error] {err}"
```

*(If you prefer your whole agent to stay async, just make
`mcp_execution_flow` itself `async def` and `await batch_mcp_flow` —
then call it with `await` from the `_run` loop.)*

---

## 2️⃣ — use the correct Fast‑MCP endpoint

In the client that spins up the **math agent**:
```python
math_server = MCPServerSseParams(
    url="http://0.0.0.0:8000",                    # ← root! no “/mcp”
    headers={"Content-Type": "application/json"},
    timeout=5.0,
    sse_read_timeout=30.0,
)
```

⚠️ Fast‑MCP’s `run(transport="sse", port=8000)` already exposes both
the SSE stream and the JSON‑RPC POST endpoint on that root URL.
Adding `/mcp` makes the request hit a 404, so the body is empty — that’s
exactly what the JSON decoder complained about.

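
That decoder message is exactly what the stdlib `json` module raises for an empty string, which you can confirm in two lines:

```python
import json

try:
    json.loads("")  # what an empty 404 body looks like to the JSON client
except json.JSONDecodeError as err:
    message = str(err)

print(message)  # Expecting value: line 1 column 1 (char 0)
```
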
---

## 3️⃣ — (optional) turn streaming off until everything works

```python
math_agent = Agent(
    # …
    streaming_on=False,  # ← easier to debug; turn back on later
)
```

With streaming disabled, `LiteLLM` returns plain strings, so your
`parse_llm_output()` method won’t be handed a
`CustomStreamWrapper` object any more.

---

### Quick test matrix

| Action | Expected result |
|--------|-----------------|
| `curl -X POST http://0.0.0.0:8000 -d '{"tool_name":"add","a":2,"b":3}' -H 'Content-Type: application/json'` | `{"result":5}` |
| Run `mock_math_server.py` | “Starting Mock Math Server on port 8000…” |
| Run `mcp_client.py`, type `add 2 and 3` | Agent replies something like “2 + 3 = 5”. No coroutine warning. |

As soon as the round‑trip works once, you can:

* re‑enable `streaming_on=True` and teach `parse_llm_output()` to turn a
  `CustomStreamWrapper` into text (`"".join(token.choices[0].delta.content for token in wrapper)`);
* point the agent at your real MCP servers instead of the mock one.

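
A hedged sketch of what that `parse_llm_output()` change could look like; the chunk shape (`choices[0].delta.content`) mirrors the join above, and the fake chunks here exist only so the sketch runs standalone:

```python
from types import SimpleNamespace

def parse_llm_output(response):
    """Accept either a plain string or an iterable of streamed chunks."""
    if isinstance(response, str):
        return response                      # streaming_on=False path
    return "".join(                          # streaming path
        token.choices[0].delta.content or ""  # delta.content may be None
        for token in response
    )

# fake chunks shaped like streamed LiteLLM tokens, for demonstration only
def chunk(text):
    return SimpleNamespace(
        choices=[SimpleNamespace(delta=SimpleNamespace(content=text))]
    )

print(parse_llm_output("2 + 3 = 5"))                      # 2 + 3 = 5
print(parse_llm_output([chunk("2 + 3"), chunk(" = 5")]))  # 2 + 3 = 5
```
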
Happy hacking!