🚀 Building a Smart Cisco Webex Bot: Harnessing LangGraph’s Stateful LLM Agents for AI-Powered Assistance 🤖✨

This content originally appeared on DEV Community and was authored by Evgenii

Introduction: The Power of AI-Assisted Communication in Webex

In today's fast-paced digital workplace, effective communication is key. Imagine having an intelligent assistant in your Cisco Webex Teams that can search the web, retrieve user information, and perform calculations - all while maintaining context across conversations. That's exactly what we're going to build in this tutorial!

By the end of this guide, you'll have created a powerful AI-powered Webex bot using LangGraph's stateful large language model (LLM) agents. This bot will enhance your team's interactions within Webex by providing intelligent responses and performing complex tasks efficiently.

Project Overview: What We're Building and Why

Our Webex Bot AI Assistant will leverage the following key technologies:

  • 🧠 LangChain & LangGraph: For building stateful, multi-actor AI applications
  • 🚀 OpenAI: Providing the underlying language model
  • 🤖 webex_bot library: For seamless integration with Webex Teams over WebSockets
  • 📊 SQLite: Lightweight database for maintaining conversation history

Key Features:

  • 🔍 Web search capabilities with Tavily Search
  • 👤 Retrieval of Webex user information
  • 🧮 Mathematical operations (e.g., calculating a number's power)
  • 💾 Persistent conversation history for improved context awareness

Prerequisites:

  • Basic knowledge of Python
  • Understanding of bot frameworks (helpful, but not required)
  • Understanding of LangGraph agents (helpful, but not required)

Let's dive in and start building!

Setting Up Your Development Environment

Before we start coding, let's set up our environment:

1. Clone the project repository:

   git clone https://github.com/lieranderl/webexbot-langgraph-assistant-template

2. Create a .env file in the project root with the following variables:

   # LLM ENVIRONMENT
   OPENAI_API_BASE=<OpenAI API base URL>
   OPENAI_API_KEY=<Your OpenAI API key>

   # LANGCHAIN ENVIRONMENT
   LANGCHAIN_ENDPOINT=<LangSmith endpoint URL>
   LANGCHAIN_API_KEY=<Your LangSmith API key>
   LANGCHAIN_PROJECT=<Name of your LangSmith project>

   # Tools API
   TAVILY_API_KEY=<Your Tavily API key for web search>

   # SQLite Database for Conversation History
   SQL_CONNECTION_STR=checkpoints.db

   # Webex Bot Configuration
   WEBEX_TEAMS_ACCESS_TOKEN=<Your Webex Bot Access Token>
   WEBEX_TEAMS_DOMAIN=<Your Webex Domain>

3. Install dependencies using Poetry:

   poetry install
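Optionally, you can confirm that the environment is wired up before moving on. The snippet below is a minimal sketch: the filename sanity_check.py is hypothetical, python-dotenv is already among the project's dependencies (webexbot.py uses it), and the variables checked are simply the ones from the .env file above that must be non-empty for the bot to run.

# sanity_check.py (hypothetical helper): verify required environment variables are readable
import os
from dotenv import load_dotenv

load_dotenv()  # read variables from the .env file in the project root

REQUIRED = [
    "OPENAI_API_KEY",
    "TAVILY_API_KEY",
    "WEBEX_TEAMS_ACCESS_TOKEN",
    "WEBEX_TEAMS_DOMAIN",
]

missing = [name for name in REQUIRED if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")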

Now that our environment is set up, let's start building our AI assistant!

Crafting the Brain: Building the ReAct Agent with LangGraph

The heart of our AI assistant is the ReAct agent, which we'll create using LangGraph and LangChain. This agent will process user inputs, make decisions, and generate responses.

Before diving into the code, it's helpful to understand the structure and flow of the ReAct agent we'll be building. The image below illustrates how it is wired together in this project:

[Image: structure and flow of the ReAct agent graph]

Let's break down the key components in our graph.py file:

1. Importing necessary libraries:

from datetime import datetime, timezone
from typing import Any
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent, ToolNode
from langchain_core.messages import ToolMessage
from langchain_core.prompts import ChatPromptTemplate
from .tools import search, get_webex_user_info, power

2. Creating a ToolNode with our custom tools:

tools = ToolNode([search, get_webex_user_info, power])

3. Initializing the OpenAI model:

model = ChatOpenAI(model="gpt-4o")

4. Defining the system prompt:

SYSTEM_PROMPT = """You are a helpful AI assistant in a Webex chat. 
Your primary tasks include:
1. Searching the web to provide relevant information.
2. Retrieving personal information from Webex.
3. Performing power calculations.
4. Interacting with users to assist with their queries.
Ensure your responses are clear, accurate, and concise. Always aim to provide the most helpful information based on the user's request.
System time: {system_time}"""

5. Creating the chat prompt template:

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", SYSTEM_PROMPT),
        ("placeholder", "{messages}"),
    ]
)

6. Formatting the state for the model:

def format_for_model(state) -> Any:
    # Keep only the five most recent messages to limit token usage.
    messages = state["messages"][-5:]
    # If the oldest retained message is a tool result, widen the window by one
    # so the AI message that issued the tool call stays with its result.
    # (Guard the index so short histories don't raise an IndexError.)
    if len(state["messages"]) >= 5 and isinstance(state["messages"][-5], ToolMessage):
        messages = state["messages"][-6:]
    return prompt.invoke(
        {
            "messages": messages,
            "system_time": datetime.now(timezone.utc).astimezone().isoformat(),
        }
    )

System Time in Prompt:
We include the current system time in the prompt for several reasons:

  • It helps the AI understand the temporal context of the conversation.
  • It allows the bot to provide time-sensitive information accurately.

Message Filtering:
We trim the history to the most recent messages before passing it to the model (a toy example follows the list below). This is crucial because:

  • It helps manage token usage, which can be a significant cost factor.
  • It maintains recent context without overwhelming the model with too much history.
  • It ensures that tool messages are properly contextualized with their preceding messages.
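To make the ToolMessage check concrete, here is a toy walk-through of the windowing logic. It is only an illustration: the module path webexbot.graph is assumed from the package layout (the bot is later run with python -m webexbot.webexbot), and the message contents are made up.

from langchain_core.messages import AIMessage, HumanMessage, ToolMessage

from webexbot.graph import format_for_model  # assumed module path for graph.py

history = [
    HumanMessage("what is 2^10?"),
    AIMessage("", tool_calls=[{"name": "power", "args": {"a": 2, "b": 10}, "id": "call_1"}]),
    ToolMessage(content="1024", tool_call_id="call_1"),  # this is history[-5]
    AIMessage("2^10 is 1024"),
    HumanMessage("and 3^3?"),
    AIMessage("3^3 is 27"),
    HumanMessage("thanks"),
]

prompt_value = format_for_model({"messages": history})
# The oldest message in the plain five-message window would be the ToolMessage,
# so the window widens to six messages and the AIMessage that issued the tool
# call stays with its result (OpenAI rejects a tool result without its call).
print(len(prompt_value.messages))  # system prompt + 6 history messages = 7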

State Modification:
The format_for_model function reshapes the state into a prompt just before it reaches the model; we plug it into the agent as the state_modifier in the next step.
7. Creating the ReAct agent:

graph = create_react_agent(model, tools, state_modifier=format_for_model)

We're using create_react_agent from langgraph.prebuilt to create a graph that works with a chat model utilizing tool calling. You can find more information about this function in the LangGraph documentation.

Passing format_for_model as the state_modifier is crucial because it:

  • Prepares the input in a format the model expects.
  • Includes only relevant information, improving efficiency and response quality.
  • Adds dynamic information like the current time to each interaction.

8. Naming the ReAct agent:

Giving the graph a name makes its runs easy to identify in LangSmith:

# Name the agent so its runs are easy to find in LangSmith traces
graph.name = "Webex Bot Demo ReAct Agent"

This setup creates a powerful ReAct agent that can understand context, use tools, and generate appropriate responses.
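Before wiring the agent into Webex, you can exercise the graph directly from Python. This is a minimal sketch under a few assumptions: the module path webexbot.graph matches the repository layout, your .env is in the current directory, and the configurable values mirror what the bot will pass in later.

from dotenv import load_dotenv
from langchain_core.messages import HumanMessage

load_dotenv()  # OPENAI_API_KEY must be set before graph.py builds the model

from webexbot.graph import graph  # assumed module path for graph.py

result = graph.invoke(
    {"messages": [HumanMessage("What is 2 to the power of 16?")]},
    config={
        "configurable": {
            "displayName": "Test User",
            "email": "test@example.com",
            "max_results": 3,
        }
    },
)
print(result["messages"][-1].content)  # the agent's final answer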

Empowering Your Bot: Implementing Custom Tools

Our bot's capabilities are enhanced by custom tools. Let's look at the key tools in our tools.py file:

1. Web Search Tool:

from typing import Annotated, Any, Dict, List, Optional, cast
from langchain_core.runnables import RunnableConfig
from langchain_core.tools import InjectedToolArg
from langchain_community.tools.tavily_search import TavilySearchResults

def search(
    query: str, *, config: Annotated[RunnableConfig, InjectedToolArg]
) -> Optional[List[Dict[str, Any]]]:
    # max_results is injected at invoke time via the "configurable" dict
    max_results: int = config.get("configurable", {}).get("max_results") or 5
    wrapped = TavilySearchResults(max_results=max_results)
    result = wrapped.invoke({"query": query})
    return cast(List[Dict[str, Any]], result)

2. Power Calculation Tool:

def power(a: int, b: int) -> int:
    """Return a raised to the power of b."""
    return a**b

3. Webex User Info Retrieval Tool:

def get_webex_user_info(
    config: Annotated[RunnableConfig, InjectedToolArg],
) -> Optional[Dict[str, str]]:
    # Both values are injected at invoke time from the Webex activity payload
    displayName: str = config.get("configurable", {}).get("displayName") or ""
    email: str = config.get("configurable", {}).get("email") or ""
    return {
        "displayName": displayName,
        "email": email,
    }

These tools allow our bot to perform web searches, mathematical calculations, and retrieve user information from Webex.
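Notice that the Webex-specific values and max_results never appear as model-visible arguments; the tools read them from the RunnableConfig that is injected at invoke time. Since the tools are plain functions, you can see this by calling them directly (the values below are made up, and webexbot.tools is the assumed module path for tools.py):

from webexbot.tools import get_webex_user_info, power  # assumed module path

# The "configurable" dict stands in for the config the graph injects at runtime.
info = get_webex_user_info(
    config={"configurable": {"displayName": "Ada", "email": "ada@example.com"}}
)
print(info)         # {'displayName': 'Ada', 'email': 'ada@example.com'}
print(power(2, 8))  # 256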

Maintaining Context: Checkpointing with SQLite

To ensure our bot maintains context across conversations, we use SQLite for checkpointing. Here's how we set it up in our invoker.py file:

import os
from typing import Any
from uuid import uuid4

from langchain_core.messages import HumanMessage
from langchain_core.runnables import RunnableConfig
from langgraph.checkpoint.sqlite import SqliteSaver

from .graph import graph


def graph_db_invoke(message: str, **kwargs: Any) -> dict[str, Any] | Any:
    connection = os.getenv("SQL_CONNECTION_STR") or "checkpoints.db"
    # Open the SQLite checkpointer for the duration of this invocation
    with SqliteSaver.from_conn_string(connection) as saver:
        graph.checkpointer = saver
        run_id = uuid4()
        config = RunnableConfig(
            configurable={
                "model": "gpt-4o-mini",
                "temperature": 0.1,
                **kwargs,  # thread_id, email, displayName, max_results, ...
            },
            run_id=run_id,
        )
        return graph.invoke(
            input={"messages": HumanMessage(content=message)},
            config=config,
        )

This function invokes our graph with checkpointing, ensuring that the conversation state persists between interactions.
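Because each checkpoint is keyed by thread_id, two calls that share a thread_id continue the same conversation, while a different thread_id starts from a clean slate. A minimal sketch, assuming the module path webexbot.invoker and made-up user details:

from webexbot.invoker import graph_db_invoke  # assumed module path for invoker.py

user = {"displayName": "Ada", "email": "ada@example.com", "max_results": 5}

# Same thread_id: the second call sees the first exchange in its history.
graph_db_invoke("My favourite number is 42.", thread_id="room-123", **user)
reply = graph_db_invoke("What is my favourite number?", thread_id="room-123", **user)
print(reply["messages"][-1].content)  # should mention 42

# Different thread_id: no memory of that exchange.
other = graph_db_invoke("What is my favourite number?", thread_id="room-456", **user)
print(other["messages"][-1].content)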

Bringing It All Together: Integrating with Webex Teams

Now, let's integrate our AI assistant with Webex Teams.

First, we need to create and register the bot on Webex. For instructions on how to create and register your bot, follow the Webex Bot guide on the Webex for Developers portal.

In our webexbot.py file:

import os

from dotenv import load_dotenv
from webex_bot.webex_bot import WebexBot

from .commands import OpenAI

load_dotenv()

bot = WebexBot(
    teams_bot_token=os.getenv("WEBEX_TEAMS_ACCESS_TOKEN"),
    approved_domains=[os.getenv("WEBEX_TEAMS_DOMAIN")],
    bot_name="AI-Assistant",
    bot_help_subtitle="",
    threads=False,
    help_command=OpenAI(),
)

bot.run()

And in our commands.py file:

from webex_bot.models.command import Command

from .invoker import graph_db_invoke


class OpenAI(Command):
    def __init__(self):
        super().__init__()

    def execute(self, message, attachment_actions, activity):
        # Pass Webex context (room/thread and user identity) into the agent's config
        response = graph_db_invoke(
            message,
            thread_id=activity["target"]["globalId"],
            email=activity["actor"]["id"],
            displayName=activity["actor"]["displayName"],
            max_results=5,
        )
        return response["messages"][-1].content

graph_db_invoke: This function processes the input message and generates a response using the ReAct agent.

Parameters we are passing:

  • message: The input message from the user.
  • thread_id: The global ID of the conversation thread; in this context, the ID of the Webex room.
  • email: The email address of the user who sent the message, retrieved via the Webex SDK.
  • displayName: The display name of the user who sent the message, retrieved via the Webex SDK.
  • max_results: The maximum number of Tavily search results to return.

This setup allows our bot to process all text inputs using our ReAct agent and respond within Webex Teams.

🚀 Running your AI Assistant

This project uses Poetry for dependency management and Poe the Poet for task running. Read more in the project README.

To start your Webex Bot AI Assistant, use the following command:

poe start

or

poetry run python -m webexbot.webexbot

Your bot is now live and ready to assist users in Webex Teams!

Example Interaction

[Image: example interaction with the AI assistant in a Webex space]

Supercharging Development: Tracing and Analyzing with LangSmith

To gain insights into your bot's performance and behavior, we've integrated LangSmith for tracing. This allows you to visualize the decision-making process of your AI assistant and optimize its performance.

[Image: LangSmith trace of an agent run]

Conclusion: Your Webex Bot AI Assistant in Action

Congratulations! You've successfully built a powerful Webex Bot AI Assistant using LangGraph's stateful LLM agents. Your bot can now:

  • Understand and maintain context in conversations
  • Perform web searches for up-to-date information
  • Retrieve Webex user information
  • Calculate a number's power
  • Provide intelligent responses to user queries

This AI-powered assistant will significantly enhance communication and productivity within your Webex Teams environment.

Next Steps: Enhancing and Expanding Your Bot's Capabilities

To further improve your Webex Bot AI Assistant, consider:

  1. Adding more custom tools to expand its capabilities (see the sketch after this list)
  2. Implementing user authentication for sensitive operations
  3. Integrating with other APIs or services relevant to your organization
  4. Fine-tuning the AI model for your specific use case
  5. Implementing multi-language support
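As an illustration of the first item, adding a capability is mostly a matter of writing one more plain function in tools.py and registering it in the ToolNode in graph.py. The celsius_to_fahrenheit helper below is hypothetical, purely to show the shape of a new tool:

# tools.py - a hypothetical extra tool
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# graph.py - register it alongside the existing tools
# tools = ToolNode([search, get_webex_user_info, power, celsius_to_fahrenheit])

Because the tool schema is derived from the function's signature and docstring, clear type hints and a descriptive docstring are usually all the model needs to decide when to call it.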

By continually refining and expanding your bot's abilities, you'll create an invaluable asset for your team's Webex communications.

Happy coding, and enjoy your new AI-powered Webex assistant!

Troubleshooting

If you encounter any issues while setting up or running your bot, check these common problems:

  1. API Key Issues: Ensure all API keys in your .env file are correct and up-to-date.
  2. Dependency Conflicts: Make sure you're using the correct versions of all libraries. Check the pyproject.toml file for version specifications.
  3. Webex Integration: If your bot isn't responding in Webex, verify that your bot's access token is correct and that it has the necessary permissions (a quick token check follows this list).
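For the third point, a quick way to confirm the token itself is to ask the Webex API who it belongs to: a 200 response with your bot's identity means the token is valid. A minimal sketch using the requests library (install it if it isn't already in your environment):

import os

import requests
from dotenv import load_dotenv

load_dotenv()

resp = requests.get(
    "https://webexapis.com/v1/people/me",
    headers={"Authorization": f"Bearer {os.environ['WEBEX_TEAMS_ACCESS_TOKEN']}"},
    timeout=10,
)
print(resp.status_code)                       # 200 means the token is valid
print(resp.json().get("displayName", "n/a"))  # your bot's display name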

For more specific issues, consult the documentation of the relevant libraries or reach out to the community on dev.to or GitHub.

Remember, building AI-powered bots is an iterative process. Don't be afraid to experiment, learn, and improve your bot over time. Good luck with your project!

