[DOCS][Swarm models]

pull/584/head
Your Name 4 months ago
parent 4a9eb771f7
commit 63a5ad4fd5

@@ -1,9 +1,6 @@
# Swarm Models
-## Install
```bash
$ pip3 install -U swarm-models
```
@@ -16,7 +13,7 @@ Welcome to the documentation for the llm section of the swarms package, designed
3. [Google PaLM](#google-palm)
4. [Anthropic](#anthropic)
-### 1. OpenAI (swarms.agents.models.OpenAI)
+### 1. OpenAI (swarm_models.OpenAI)
The OpenAI class provides an interface to interact with OpenAI's language models. It allows both synchronous and asynchronous interactions.
@@ -40,7 +37,7 @@ OpenAI(api_key: str, system: str = None, console: bool = True, model: str = None
**Methods:**
-- `generate(message: str, **kwargs) -> str`: Generate a response using the OpenAI model.
+- `run(message: str, **kwargs) -> str`: Generate a response using the OpenAI model.
- `generate_async(message: str, **kwargs) -> str`: Generate a response asynchronously.
@@ -56,7 +53,7 @@ from swarm_models import OpenAI
chat = OpenAI(api_key="YOUR_OPENAI_API_KEY")
-response = chat.generate("Hello, how can I assist you?")
+response = chat.run("Hello, how can I assist you?")
print(response)
ids = ["id1", "id2", "id3"]
@@ -64,7 +61,7 @@ async_responses = asyncio.run(chat.ask_multiple(ids, "How is {id}?"))
print(async_responses)
```
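The method list above also documents `generate_async`; the sketch below shows one way it might be driven from `asyncio`, under the assumption that it can be awaited inside a coroutine the same way `ask_multiple` is awaited via `asyncio.run` in the example above:
```python
import asyncio

from swarm_models import OpenAI

chat = OpenAI(api_key="YOUR_OPENAI_API_KEY")

async def main():
    # Assumption: generate_async is awaitable and resolves to the
    # generated string, matching the documented `-> str` return type.
    reply = await chat.generate_async("Hello, how can I assist you?")
    print(reply)

asyncio.run(main())
```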
-### 2. HuggingFace (swarms.agents.models.HuggingFaceLLM)
+### 2. HuggingFace (swarm_models.HuggingFaceLLM)
The HuggingFaceLLM class allows interaction with language models from Hugging Face.
@@ -87,7 +84,7 @@ HuggingFaceLLM(model_id: str, device: str = None, max_length: int = 20, quantize
**Methods:**
-- `generate(prompt_text: str, max_length: int = None) -> str`: Generate text based on a prompt.
+- `run(prompt_text: str, max_length: int = None) -> str`: Generate text based on a prompt.
**Usage Example:**
```python
@@ -97,54 +94,11 @@ model_id = "gpt2"
hugging_face_model = HuggingFaceLLM(model_id=model_id)
prompt = "Once upon a time"
-generated_text = hugging_face_model.generate(prompt)
+generated_text = hugging_face_model.run(prompt)
print(generated_text)
```
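The constructor signature documented above also exposes `device`, `max_length`, and a `quantize` option; below is a minimal sketch of passing them explicitly, with `quantize` assumed here to be a plain boolean flag:
```python
from swarm_models import HuggingFaceLLM

# Sketch only: device and max_length are documented constructor parameters;
# quantize is assumed to be a boolean flag for loading quantized weights.
hugging_face_model = HuggingFaceLLM(
    model_id="gpt2",
    device="cuda",   # documented default is None; use "cpu" if no GPU is available
    max_length=64,   # documented default is 20
    quantize=True,   # assumption: loads a quantized variant of the model
)

print(hugging_face_model.run("Once upon a time", max_length=64))
```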
-### 3. Google PaLM (swarms.agents.models.GooglePalm)
-The GooglePalm class provides an interface for Google's PaLM Chat API.
-**Constructor:**
-```python
-GooglePalm(model_name: str = "models/chat-bison-001", google_api_key: str = None, temperature: float = None, top_p: float = None, top_k: int = None, n: int = 1)
-```
-**Attributes:**
-- `model_name` (str): Name of the Google PaLM model.
-- `google_api_key` (str, optional): Google API key.
-- `temperature` (float, optional): Temperature for text generation.
-- `top_p` (float, optional): Top-p sampling value.
-- `top_k` (int, optional): Top-k sampling value.
-- `n` (int, default=1): Number of candidate completions.
-**Methods:**
-- `generate(messages: List[Dict[str, Any]], stop: List[str] = None, **kwargs) -> Dict[str, Any]`: Generate text based on a list of messages.
-- `__call__(messages: List[Dict[str, Any]], stop: List[str] = None, **kwargs) -> Dict[str, Any]`: Generate text using the call syntax.
-**Usage Example:**
-```python
-from swarm_models import GooglePalm
-google_palm = GooglePalm()
-messages = [
-    {"role": "system", "content": "You are a helpful assistant"},
-    {"role": "user", "content": "Tell me a joke"},
-]
-response = google_palm.generate(messages)
-print(response["choices"][0]["text"])
-```
-### 4. Anthropic (swarms.agents.models.Anthropic)
+### 3. Anthropic (swarm_models.Anthropic)
The Anthropic class enables interaction with Anthropic's large language models.
@@ -171,7 +125,7 @@ Anthropic(model: str = "claude-2", max_tokens_to_sample: int = 256, temperature:
**Methods:**
-- `generate(prompt: str, stop: List[str] = None) -> str`: Generate text based on a prompt.
+- `run(prompt: str, stop: List[str] = None) -> str`: Generate text based on a prompt.
**Usage Example:**
```python
@@ -179,7 +133,7 @@ from swarm_models import Anthropic
anthropic = Anthropic()
prompt = "Once upon a time"
-generated_text = anthropic.generate(prompt)
+generated_text = anthropic.run(prompt)
print(generated_text)
```
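The `run` method documented above also takes an optional `stop` list; a minimal sketch combining it with the documented constructor parameters:
```python
from swarm_models import Anthropic

# model and max_tokens_to_sample are the constructor parameters documented above.
anthropic = Anthropic(model="claude-2", max_tokens_to_sample=128)

generated_text = anthropic.run(
    "Write a two-sentence bedtime story about a robot.",
    stop=["\n\n"],  # halt generation at the first blank line
)
print(generated_text)
```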
