updated readme, fix paths for imports after moving to playground demos

pull/570/head
Richard Anthony Hein 8 months ago
parent b43e940890
commit b439fa0942

@@ -18,7 +18,18 @@ In theory, any OpenAI compatible LLM endpoint is supported via the OpenAIChatLLM
### Quickstart
* Start vLLM using Docker container by running the [dockerRunVllm](./server/dockerRunVllm.sh). Adjust the script to select your desired model and set the HUGGING_FACE_HUB_TOKEN.
* Start vLLM with GPU support in a Docker container by running the [dockerRunVllm](./server/dockerRunVllm.sh) script. Adjust the script to select your desired model and to set the HUGGING_FACE_HUB_TOKEN.
  * For CPU support (not recommended for vLLM), build and run it in Docker using this [Dockerfile](./Dockerfile):
```bash
cd <root>/swarms/playground/demos/chatbot
docker build -t llm-serving:vllm-cpu -f ~/vllm/Dockerfile.cpu .
docker run --rm --env "HF_TOKEN=<your hugging face token>" \
--ipc=host \
-p 8000:8000 \
llm-serving:vllm-cpu \
--model NousResearch/Hermes-3-Llama-3.1-8B
```
* Start the Chatbot API Server with the following shell command:
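The [dockerRunVllm](./server/dockerRunVllm.sh) script referenced in the Quickstart bullets above is not included in this diff. As a rough, non-authoritative sketch of the kind of GPU launch command such a script typically wraps (the image tag, port, and model name are assumptions based on standard vLLM Docker usage and on the model used elsewhere in this README):

```bash
# Hypothetical sketch only -- the actual dockerRunVllm.sh lives in the repo and may differ.
docker run --rm --gpus all \
    --env "HUGGING_FACE_HUB_TOKEN=<your hugging face token>" \
    --ipc=host \
    -p 8000:8000 \
    vllm/vllm-openai:latest \
    --model NousResearch/Hermes-3-Llama-3.1-8B
```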

@@ -1,4 +1,4 @@
from swarms.server.vector_store import VectorStorage
from playground.demos.chatbot.server.vector_store import VectorStorage
__all__ = [
"VectorStorage",

@@ -27,9 +27,9 @@ from swarms.prompts.conversational_RAG import (
E_SYS,
QA_PROMPT_TEMPLATE_STR,
)
from swarms.server.responses import StreamingResponse
from swarms.server.server_models import ChatRequest
from swarms.server.vector_store import VectorStorage
from playground.demos.chatbot.server.responses import StreamingResponse
from playground.demos.chatbot.server.server_models import ChatRequest
from playground.demos.chatbot.server.vector_store import VectorStorage
from swarms.models.popular_llms import OpenAIChatLLM
# Explicitly specify the path to the .env file
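Since this hunk routes the chatbot through the OpenAIChatLLM interface, which per the README excerpt above can target any OpenAI-compatible endpoint, a plain chat-completions request is a quick way to confirm the vLLM server from the Quickstart is reachable. The URL and model name below assume the defaults used earlier in this README; adjust them to your deployment:

```bash
# Sanity check against the OpenAI-compatible endpoint exposed by vLLM (host/port assumed).
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "NousResearch/Hermes-3-Llama-3.1-8B",
         "messages": [{"role": "user", "content": "Hello"}]}'
```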

@@ -15,7 +15,7 @@ from langchain.storage import LocalFileStore
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_chroma import Chroma
from swarms.server.async_parent_document_retriever import \
from playground.demos.chatbot.server.async_parent_document_retriever import \
    AsyncParentDocumentRetriever
STORE_TYPE = "local" # "redis" or "local"
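One practical consequence of these import changes: the modules are now addressed by the absolute package path playground.demos.chatbot.server, so the demo has to be launched with the swarms repository root on the Python path. A minimal sketch, assuming the checkout location and entry-point module name (both are assumptions, not shown in this diff):

```bash
# Hypothetical sketch: run from the repository root so playground.* imports resolve.
cd <root>/swarms
export PYTHONPATH="$PWD"
python -m playground.demos.chatbot.server.server   # entry-point module name assumed
```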
