worker example

Former-commit-id: 916e66d03f
discord-bot-framework
Kye 1 year ago
parent e894530800
commit 9faf2025f0

@@ -1,23 +0,0 @@
```python
from swarms.artifacts import BaseArtifact
from swarms.drivers import LocalVectorStoreDriver
from swarms.loaders import WebLoader
vector_store = LocalVectorStoreDriver()

# Load the page, split it into artifacts, and index each one under the "swarms" namespace.
[
    vector_store.upsert_text_artifact(a, namespace="swarms")
    for a in WebLoader(max_tokens=100).load("https://www.swarms.ai")
]

# Query the namespace for the three entries most related to "creativity".
results = vector_store.query(
    "creativity",
    count=3,
    namespace="swarms",
)

# Each result carries the serialized artifact in its metadata; recover the text and print it.
values = [BaseArtifact.from_json(r.meta["artifact"]).value for r in results]

print("\n\n".join(values))
```

@@ -53,14 +53,33 @@ Voila! You're now ready to summon your Worker.
Here's a simple way to invoke the Worker and give it a task:
```python
from swarms.models import OpenAIChat
from swarms import Worker
# Enter your API key
llm = OpenAIChat(
    openai_api_key="",
    temperature=0.5,
)

node = Worker(
    llm=llm,
    ai_name="Optimus Prime",
    openai_api_key="",
    ai_role="Worker in a swarm",
    external_tools=None,
    human_in_the_loop=False,
    temperature=0.5,
)

task = "What were the winning Boston Marathon times for the past 5 years (ending in 2022)? Generate a table of the year, name, country of origin, and times."
response = node.run(task)
print(response)
```
The result? An agent with elegantly integrated tools and long-term memories.
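Because the Worker keeps long-term memory, a natural follow-up is to ask it about the output it just produced. A minimal sketch, assuming the exchange above is retained in the node's memory:

```python
# Hypothetical follow-up prompt: the Worker should be able to refer back to the table it just generated.
follow_up = node.run("From the table you just produced, which year had the fastest winning time?")
print(follow_up)
```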
---

@@ -49,28 +49,181 @@ Makes the Worker class callable. When an instance of the class is called, it will
### **Example 1**: Basic usage with default parameters:
```python
from swarms.models import OpenAIChat
from swarms import Worker
# Enter your API key
llm = OpenAIChat(
    openai_api_key="",
    temperature=0.5,
)

node = Worker(
    llm=llm,
    ai_name="Optimus Prime",
    openai_api_key="",
    ai_role="Worker in a swarm",
    external_tools=None,
    human_in_the_loop=False,
    temperature=0.5,
)

task = "What were the winning Boston Marathon times for the past 5 years (ending in 2022)? Generate a table of the year, name, country of origin, and times."
response = node.run(task)
print(response)
```
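As the reference above notes, the Worker class is also callable. A minimal sketch, assuming `__call__` forwards to the same task pipeline as `run`:

```python
# Assumes Worker.__call__ delegates to Worker.run, per the description above.
response = node("Summarize the winning times table in one sentence.")
print(response)
```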
### **Example 2**: Usage with custom tools:
```python
import os

import interpreter

from swarms.agents.hf_agents import HFAgent
from swarms.agents.omni_modal_agent import OmniModalAgent
from swarms.models import OpenAIChat
from swarms.tools.autogpt import tool
from swarms.workers import Worker

# Initialize the API key
api_key = ""

# Initialize the language model.
# This model can be swapped out for Anthropic, Hugging Face models such as Mistral, etc.
llm = OpenAIChat(
    openai_api_key=api_key,
    temperature=0.5,
)
# Wrap a function with the @tool decorator to make it a tool, then add a docstring for tool documentation.
@tool
def hf_agent(task: str = None):
    """
    A tool that uses an OpenAI model to respond to a task by searching for a model on Hugging Face.
    It first downloads the model, then uses it.

    Rules: Don't call this tool for simple tasks like generating a summary; only call it for
    multi-modal tasks like generating images, videos, speech, etc.
    """
    agent = HFAgent(model="text-davinci-003", api_key=api_key)
    response = agent.run(task, text="¡Este es un API muy agradable!")
    return response
# Wrap a function with the @tool decorator to make it a tool.
@tool
def omni_agent(task: str = None):
    """
    A tool that uses an OpenAI model to call Hugging Face models and guide them to perform a task.

    Rules: Don't call this tool for simple tasks like generating a summary; only call it for
    multi-modal tasks like generating images, videos, or speech.

    Tasks the omni agent is good for:
    --------------
    document-question-answering
    image-captioning
    image-question-answering
    image-segmentation
    speech-to-text
    summarization
    text-classification
    text-question-answering
    translation
    huggingface-tools/text-to-image
    huggingface-tools/text-to-video
    text-to-speech
    huggingface-tools/text-download
    huggingface-tools/image-transformation
    """
    agent = OmniModalAgent(llm)
    response = agent.run(task)
    return response
# Code Interpreter
@tool
def compile(task: str):
    """
    Open Interpreter lets LLMs run code (Python, JavaScript, Shell, and more) locally.
    You can chat with Open Interpreter through a ChatGPT-like interface in your terminal
    by running $ interpreter after installing.

    This provides a natural-language interface to your computer's general-purpose capabilities:
    Create and edit photos, videos, PDFs, etc.
    Control a Chrome browser to perform research
    Plot, clean, and analyze large datasets
    ...etc.

    ⚠️ Note: You'll be asked to approve code before it's run.

    Rules: Only use when asked to generate code or an application of some kind.
    """
    task = interpreter.chat(task, return_messages=True)
    interpreter.chat()
    interpreter.reset(task)

# Environment variable values must be strings, not booleans.
os.environ["INTERPRETER_CLI_AUTO_RUN"] = "True"
os.environ["INTERPRETER_CLI_FAST_MODE"] = "True"
os.environ["INTERPRETER_CLI_DEBUG"] = "True"
# Append the tools to a list
tools = [hf_agent, omni_agent, compile]

# Initialize a single Worker node with the previously defined tools in addition to its
# predefined tools
node = Worker(
    llm=llm,
    ai_name="Optimus Prime",
    openai_api_key=api_key,
    ai_role="Worker in a swarm",
    external_tools=tools,
    human_in_the_loop=False,
    temperature=0.5,
)

# Specify the task
task = "What were the winning Boston Marathon times for the past 5 years (ending in 2022)? Generate a table of the year, name, country of origin, and times."

# Run the node on the task
response = node.run(task)

# Print the response
print(response)
```
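The same `@tool` decorator pattern works for much smaller custom tools. A minimal sketch with a hypothetical `word_count` tool (not part of the library):

```python
from swarms.models import OpenAIChat
from swarms.tools.autogpt import tool
from swarms.workers import Worker

# Hypothetical example tool: any plain function wrapped with @tool and documented
# with a docstring can be handed to a Worker via external_tools.
@tool
def word_count(text: str) -> str:
    """Count the number of words in the given text and return the count as a string."""
    return str(len(text.split()))

llm = OpenAIChat(openai_api_key="", temperature=0.5)

node = Worker(
    llm=llm,
    ai_name="Optimus Prime",
    openai_api_key="",
    ai_role="Worker in a swarm",
    external_tools=[word_count],
    human_in_the_loop=False,
    temperature=0.5,
)
```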
### **Example 3**: Usage with human in the loop:
```python
from swarms.models import OpenAIChat
from swarms import Worker
# Enter your API key
llm = OpenAIChat(
    openai_api_key="",
    temperature=0.5,
)

node = Worker(
    llm=llm,
    ai_name="Optimus Prime",
    openai_api_key="",
    ai_role="Worker in a swarm",
    external_tools=None,
    human_in_the_loop=True,
    temperature=0.5,
)

task = "What were the winning Boston Marathon times for the past 5 years (ending in 2022)? Generate a table of the year, name, country of origin, and times."
response = node.run(task)
print(response)
```
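If you want an explicit review loop on top of `human_in_the_loop=True`, the node can simply be re-run with operator feedback between turns. A minimal sketch using only `node.run` from above:

```python
# Illustrative manual review loop around the Worker (not a library feature).
task = "Draft a short summary of the winning Boston Marathon times table."
while True:
    response = node.run(task)
    print(response)
    feedback = input("Press Enter to accept, or type a revision request: ").strip()
    if not feedback:
        break
    task = feedback
```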
## **Mathematical Description**:

@@ -101,7 +101,7 @@ nav:
- OmniAgent: "examples/omni_agent.md"
- Worker:
- Basic: "examples/worker.md"
- StackedWorker: "examplses"
- StackedWorker: "examples/stacked_worker.md"
- Applications:
- CustomerSupport:
