
Walkthrough Guide: Getting Started with Swarms Module's Flow Feature

Introduction

Welcome to the walkthrough guide for beginners on using the "Flow" feature within the Swarms module. This guide is designed to help you understand and utilize the capabilities of the Flow class for seamless interactions with AI language models.

Target Audience:

  • This guide is primarily intended for beginners who want to learn how to use the Flow feature in the Swarms module to interact with AI language models effectively.

Table of Contents

1. Understanding the Flow Feature

   - 1.1 What is the Flow Feature?

   - 1.2 Key Concepts

2. Setting Up the Environment

   - 2.1 Prerequisites

   - 2.2 Installing Required Libraries

   - 2.3 Importing Necessary Modules

3. Creating a Flow Instance

   - 3.1 Importing the Required Modules

   - 3.2 Initializing the Language Model

   - 3.3 Creating a Flow Instance

4. Running a Flow

   - 4.1 Defining the Task

   - 4.2 Running the Flow

   - 4.3 Interacting with the AI

   - 4.4 Dynamic Temperature Handling

5. Customizing Flow Behavior

   - 5.1 Stopping Conditions

   - 5.2 Retry Mechanism

   - 5.3 Loop Interval

   - 5.4 Interactive Mode

6. Saving and Loading Flows

   - 6.1 Saving a Flow

   - 6.2 Loading a Saved Flow

7. Analyzing Feedback and Undoing Actions

   - 7.1 Providing Feedback

   - 7.2 Undoing the Last Action

   - 7.3 Response Filtering

8. Advanced Features

   - 8.1 Streamed Generation

   - 8.2 Real-time Token Generation

9. Best Practices

   - 9.1 Conciseness and Clarity

   - 9.2 Active Voice

   - 9.3 Highlighting Important Points

   - 9.4 Consistent Style

10. Conclusion


1. Understanding the Flow Feature

1.1 What is the Flow Feature?

The Flow feature is a powerful component of the Swarms framework that allows developers to create sequential, conversational interactions with AI language models. With it, you can build multi-step conversations, generate long-form content, and perform complex tasks. The Flow class gives the language model a degree of autonomy while keeping its responses structured.

1.2 Key Concepts

Before diving into the practical aspects, let's clarify some key concepts related to the Flow feature:

  • Flow: A Flow is an instance of the Flow class that represents an ongoing interaction with an AI language model. It consists of a series of steps and responses.

  • Stopping Condition: A stopping condition is a criterion that, when met, allows the Flow to stop generating responses. This can be user-defined and can depend on the content of the responses.

  • Loop Interval: The loop interval specifies the time delay between consecutive interactions with the AI model.

  • Retry Mechanism: In case of errors or failures during AI model interactions, the Flow can be configured to make multiple retry attempts with a specified interval.

  • Interactive Mode: Interactive mode allows developers to have a back-and-forth conversation with the AI model, making it suitable for real-time interactions.

2. Setting Up the Environment

2.1 Prerequisites

Before you begin, ensure that you have the following prerequisites in place:

  • Basic understanding of Python programming.

  • Access to an AI language model or API key for language model services.

2.2 Installing Required Libraries

To use the Flow feature, you'll need to install the required libraries. Make sure you have these libraries installed:

  • termcolor: For colorful console output.

  • inspect: For introspecting the language model (part of the Python standard library, so no installation is required).

  • random: For handling dynamic temperature (also part of the standard library).

  • Other dependencies as needed for your specific environment.

Only termcolor (and the Swarms package itself, if you have not installed it yet) needs to be installed with pip:

```bash
pip install swarms termcolor
```

2.3 Importing Necessary Modules

In your Python script or environment, import the necessary modules from the Swarms framework:


```python
import json
import logging
import time
from typing import Any, Callable, Dict, List, Optional, Tuple, Generator

from termcolor import colored
import inspect
import random
```

Ensure that you have these modules imported to proceed with the guide.

3. Creating a Flow Instance

To use the Flow feature, you need to create an instance of the Flow class. This instance will allow you to interact with the AI language model.

3.1 Importing the Required Modules

In your script, import the required modules for the Flow class:


```python
from swarms.structs import Flow
from swarms.models import OpenAIChat  # Adjust this import according to your specific language model.
```

3.2 Initializing the Language Model

Initialize the language model you want to use for interactions. In this example, we're using the OpenAIChat model:


```python
# Replace 'your_api_key' with your actual API key or configuration.
llm = OpenAIChat(
    openai_api_key='your_api_key',
    temperature=0.5,
    max_tokens=3000,
)
```

Make sure to provide the necessary configuration, such as your API key and any model-specific parameters.

3.3 Creating a Flow Instance

Now, create an instance of the Flow class by passing the initialized language model:


```python
flow = Flow(
    llm=llm,
    max_loops=5,
    dashboard=True,
    stopping_condition=None,  # You can define a stopping condition as needed.
    loop_interval=1,
    retry_attempts=3,
    retry_interval=1,
    interactive=False,  # Set to 'True' for interactive mode.
    dynamic_temperature=False,  # Set to 'True' for dynamic temperature handling.
)
```

This sets up your Flow instance with the specified parameters. Adjust these parameters based on your requirements.

4. Running a Flow

Now that you have created a Flow instance, let's run a simple interaction with the AI model using the Flow.

4.1 Defining the Task

Define the task you want the AI model to perform. This can be any prompt or question you have in mind. For example:


```python
task = "Generate a 10,000 word blog on health and wellness."
```

4.2 Running the Flow

Run the Flow by providing the task you defined:


```python
out = flow.run(task)
```

The Flow will interact with the AI model, generate responses, and store the conversation history.

4.3 Interacting with the AI

Depending on whether you set the interactive parameter to True or False during Flow initialization, you can interact with the AI in real-time or simply receive the generated responses in sequence.

If interactive is set to True, you'll have a back-and-forth conversation with the AI, where you provide input after each AI response.
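
As a minimal sketch (reusing the `llm` initialized earlier), an interactive session looks like this; the Flow pauses for your input after each AI response:

```python
# Create a Flow in interactive mode for a back-and-forth conversation.
flow = Flow(
    llm=llm,
    max_loops=5,
    interactive=True,
)

# You will be prompted for input after each AI response.
out = flow.run("Help me outline a blog post on health and wellness.")
```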

4.4 Dynamic Temperature Handling

If you set the `dynamic_temperature` parameter to `True` during Flow initialization, the Flow class will handle temperature dynamically. Temperature affects the randomness of the responses generated by the AI model. The dynamic temperature feature allows the temperature to change randomly within a specified range, enhancing response diversity.
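
As a minimal sketch (reusing the `llm` initialized earlier), enabling the feature is just a matter of flipping the constructor flag; the exact temperature range used internally is up to the Flow implementation:

```python
# Enable dynamic temperature handling: the Flow varies the model's
# temperature between loops to diversify responses.
flow = Flow(
    llm=llm,
    max_loops=5,
    dynamic_temperature=True,
)
out = flow.run("Suggest three distinct taglines for a wellness newsletter.")
```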

5. Customizing Flow Behavior

The Flow feature provides various customization options to tailor its behavior to your specific use case.

5.1 Stopping Conditions

You can define custom stopping conditions that instruct the Flow to stop generating responses based on specific criteria. For example, you can stop when a certain keyword appears in the response:


```python
def custom_stopping_condition(response: str) -> bool:
    # Stop once the response contains the word "stop" (case-insensitive).
    return "stop" in response.lower()

# Set the custom stopping condition when creating the Flow instance.
flow = Flow(
    llm=llm,
    max_loops=5,
    stopping_condition=custom_stopping_condition,
    # Other parameters...
)
```

5.2 Retry Mechanism

In case of errors or issues during AI model interactions, you can configure a retry mechanism. Specify the number of retry attempts and the interval between retries:


```python
flow = Flow(
    llm=llm,
    max_loops=5,
    retry_attempts=3,
    retry_interval=1,
    # Other parameters...
)
```

5.3 Loop Interval

The loop_interval parameter determines the time delay between consecutive interactions with the AI model. Adjust this value based on your desired pace of conversation.
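
For example, a Flow that waits longer between loops (assuming the interval is expressed in seconds) can be sketched as:

```python
# Pause between consecutive interactions with the model
# (assuming loop_interval is expressed in seconds).
flow = Flow(
    llm=llm,
    max_loops=5,
    loop_interval=2,
)
```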

5.4 Interactive Mode

Set the interactive parameter to True if you want to have real-time conversations with the AI model. In interactive mode, you provide input after each AI response.

6. Saving and Loading Flows

You can save and load Flow instances to maintain conversation history or switch between different tasks.

6.1 Saving a Flow

To save a Flow instance along with its conversation history:


```python
flow.save("path/flow_history.json")
```

This stores the conversation history as a JSON file for future reference.

6.2 Loading a Saved Flow

To load a previously saved Flow instance:


```python
loaded_flow = Flow(llm=llm, max_loops=5)
loaded_flow.load("path/flow_history.json")
```

This loads the conversation history into the new Flow instance, allowing you to continue the conversation or analyze past interactions.
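
For example (a sketch that assumes calling run on the loaded instance builds on the restored history):

```python
# Continue the conversation from the restored history
# (assumes run() picks up the loaded conversation state).
out = loaded_flow.run("Summarize the key points we have discussed so far.")
```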

7. Analyzing Feedback and Undoing Actions

The Flow feature supports feedback collection and the ability to undo actions.

7.1 Providing Feedback

You can provide feedback on AI responses within the Flow. Feedback can be used to analyze the quality of responses or highlight issues:


```python
flow.provide_feedback("The response was unclear.")
```

7.2 Undoing the Last Action

If you want to undo the last action taken within the Flow and revert to the previous state, you can use the undo_last method:


```python
previous_state, message = flow.undo_last()
```

This helps you correct or modify previous interactions.

7.3 Response Filtering

The Flow feature allows you to add response filters to filter out specific words or content from AI responses. This can be useful for content moderation or filtering sensitive information:


```python
flow.add_response_filter("sensitive_word")
```

The response filters will replace filtered words with placeholders, ensuring that sensitive content is not displayed.

8. Advanced Features

8.1 Streamed Generation

Streamed generation allows you to generate responses token by token in real-time. This can be useful for creating interactive and dynamic conversations:


```python
response = flow.streamed_generation("Generate a report on finance")
```

This function streams each token of the response with a slight delay, simulating real-time conversation.

8.2 Real-time Token Generation

For even finer control over token generation, you can use the streamed_token_generation method. This generates tokens one by one, allowing you to have precise control over the conversation pace:


```python
for token in flow.streamed_token_generation("Generate a report on finance"):
    print(token, end="")
```

9. Best Practices

To create effective and user-friendly interactions with the AI model using the Flow feature, consider the following best practices:

9.1 Conciseness and Clarity

Ensure that your prompts and responses are concise and to the point. Avoid unnecessary verbosity.

9.2 Active Voice

Use an active voice when giving instructions or prompts. For example, say, "Generate a report" instead of "A report should be generated."

9.3 Highlighting Important Points

Use formatting options like bold text, italics, or color highlights to draw attention to important points within the conversation.

9.4 Consistent Style

Maintain a consistent tone and style throughout the conversation. If there is a style guide or specific formatting conventions, adhere to them.

10. Conclusion

In conclusion, the Flow feature in the Swarms module provides a versatile and interactive way to interact with AI language models. By following this walkthrough guide and considering the best practices, you can effectively harness the power of Flow for a wide range of applications, from generating content to performing complex tasks.

Start creating your own interactive conversations and enjoy the benefits of seamless AI interactions with the Flow feature. Happy coding!