Update README.md

pull/502/head
Kye Gomez, committed by GitHub
parent b4553cbdc5
commit f87110c401

@@ -1444,8 +1444,7 @@ Coming soon...
 This is an implementation from the paper "Mixture-of-Agents Enhances Large Language Model Capabilities" by together.ai. It achieves SOTA on AlpacaEval 2.0, MT-Bench, and FLASK, surpassing GPT-4 Omni. Great for tasks that need to be parallelized and then sequentially fed into another loop.
 ```python
-from swarms import Agent, OpenAIChat
-from swarms.structs.mixture_of_agents import MixtureOfAgents
+from swarms import Agent, OpenAIChat, MixtureOfAgents

 # Initialize the director agent
 director = Agent(

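The hunk above shows only the changed import line. As a rough usage sketch (not part of this commit), the consolidated top-level import might be exercised as below; the `MixtureOfAgents` constructor arguments (`agents`, `final_agent`, `layers`), the `Agent` keyword arguments, and the `.run()` call are assumptions inferred from the README's description of parallel agents feeding a final aggregation step, not a verified API reference.

```python
# Minimal sketch, assuming the swarms API around the time of this commit.
# Parameter names below (agent_name, llm, max_loops, agents, final_agent,
# layers) are assumptions for illustration only.
import os

from swarms import Agent, OpenAIChat, MixtureOfAgents

# Shared LLM backend; assumes OPENAI_API_KEY is set in the environment
llm = OpenAIChat(openai_api_key=os.getenv("OPENAI_API_KEY"))

# Worker agents whose outputs can be produced in parallel
accountant = Agent(agent_name="Accountant", llm=llm, max_loops=1)
analyst = Agent(agent_name="Analyst", llm=llm, max_loops=1)

# Director agent that aggregates the parallel outputs in a final pass
director = Agent(agent_name="Director", llm=llm, max_loops=1)

# Mixture-of-Agents: parallel layer(s) feeding a sequential aggregation step
moa = MixtureOfAgents(
    agents=[accountant, analyst],  # run in parallel per layer
    final_agent=director,          # sequentially aggregates layer outputs
    layers=3,                      # assumed parameter name
)

print(moa.run("Summarize the key financial risks for Q3."))
```

Functionally this mirrors the parallel-then-sequential flow the README paragraph describes; the commit itself only replaces the deep `swarms.structs.mixture_of_agents` import with the top-level one.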