pull/58/head
Kye 1 year ago
parent 2c27f5e796
commit e0f110fdef

@ -15,91 +15,27 @@ class TaskStatus(Enum):
class ScalableGroupChat:
"""
The ScalableGroupChat takes in a Worker as input and handles all the
logic for
- task creation,
- task assignment,
- and task completion.
It enables a scalable group chat, like a Telegram group, for multi-agent
collaboration at massive scale: the communication that lets millions of
agents chat with each other goes through a vector database that each
agent has access to.
Worker -> ScalableGroupChat(Worker * 10)
-> every response is embedded and placed in chroma
-> every response is then retrieved and sent to the worker
-> every worker is then updated with the new response
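The loop above can be sketched with a toy in-memory store standing in for
Chroma; all names here are illustrative, not the real API:
```python
# Illustrative sketch only: a toy in-memory stand-in for the Chroma-backed
# message loop described above.
class SharedMessageStore:
    def __init__(self):
        self.messages = []  # each entry: (sender, text)

    def add(self, sender, text):
        # "every response is embedded and placed in chroma"
        self.messages.append((sender, text))

    def updates_for(self, worker_name):
        # "every response is then retrieved and sent to the worker";
        # a worker sees everyone's responses except its own
        return [(s, t) for s, t in self.messages if s != worker_name]

store = SharedMessageStore()
store.add("Worker-0", "draft ready")
store.add("Worker-1", "reviewing the draft")
# "every worker is then updated with the new response"
updates = store.updates_for("Worker-0")
```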
Each LLM agent chats with the orchestrator through a dedicated
communication layer. The orchestrator assigns tasks to each LLM agent,
which the agents then complete and return.
This setup allows for a high degree of flexibility, scalability, and robustness.
In the context of swarm LLMs, one could consider an **Omni-Vector
Embedding Database** for communication. This database could store and manage
the high-dimensional vectors produced by each LLM agent.
Strengths: This approach would allow for similarity-based lookup and matching of
LLM-generated vectors, which can be particularly useful for tasks that involve finding similar outputs or recognizing patterns.
Weaknesses: An Omni-Vector Embedding Database might add complexity to the system in terms of setup and maintenance.
It might also require significant computational resources,
depending on the volume of data being handled and the complexity of the vectors.
The handling and transmission of high-dimensional vectors could also pose challenges
in terms of network load.
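As a concrete illustration of similarity-based lookup, here is a minimal
sketch using cosine similarity over plain Python lists; a real deployment
would use a vector database such as Chroma, and these class and method
names are invented for the example:
```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class InMemoryVectorDB:
    """Toy stand-in for a vector database such as Chroma."""

    def __init__(self):
        self.records = []  # list of (vector, message) pairs

    def add(self, vector, message):
        self.records.append((vector, message))

    def query(self, vector, top_k=1):
        # similarity-based lookup: return the top_k closest messages
        ranked = sorted(
            self.records,
            key=lambda r: cosine_similarity(vector, r[0]),
            reverse=True,
        )
        return [msg for _, msg in ranked[:top_k]]
```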
# Orchestrator
* Takes in an agent class with a vector store, then handles all the
communication and scales up a swarm to the requested number of agents,
managing task assignment and task completion.
```
from swarms import OpenAI, Orchestrator, Swarms

# Handles task assignment, allocation, and agent communication using a
# vector store as a universal communication layer, and also handles the
# task completion logic.
orchestrated = Orchestrator(OpenAI, nodes=40)

objective = "Make a business website for a marketing consultancy"
swarms = Swarms(orchestrated, auto=True, objective=objective)
```
In terms of architecture, the swarm might look something like this:
```
(Orchestrator)
/ \
Tools + Vector DB -- (LLM Agent)---(Communication Layer) (Communication Layer)---(LLM Agent)-- Tools + Vector DB
/ | | \
(Task Assignment) (Task Completion) (Task Assignment) (Task Completion)
```

### Usage
```
from swarms import Orchestrator
# Instantiate the Orchestrator with 10 agents
orchestrator = Orchestrator(llm, agent_list=[llm]*10, task_queue=[])
# Add tasks to the Orchestrator
tasks = [{"content": f"Write a short story about a {animal}."} for animal in ["cat", "dog", "bird", "fish", "lion", "tiger", "elephant", "giraffe", "monkey", "zebra"]]
orchestrator.assign_tasks(tasks)
# Run the Orchestrator
orchestrator.run()
# Retrieve the results
for task in tasks:
print(orchestrator.retrieve_result(id(task)))
```
"""
def __init__(
    self,
    worker_count: int = 5,
    collection_name: str = "swarm",
    api_key: str = None,
    model_name: str = None,
    worker=None,
):
    self.workers = []
    self.worker_count = worker_count

    # Create a list of Worker instances with unique names
    for i in range(worker_count):
        self.workers.append(
            Worker(openai_api_key=api_key, ai_name=f"Worker-{i}")
        )
def embed(self, input, api_key, model_name):
    # NOTE: body completed from the truncated original; chromadb's
    # OpenAIEmbeddingFunction accepts an api_key and model_name.
    openai = embedding_functions.OpenAIEmbeddingFunction(
        api_key=api_key,
        model_name=model_name,
    )
    embedding = openai(input)
    return embedding