How to Build an AI Chatbot for WhatsApp with Python, Twilio, and OpenAI: A Step-by-Step Guide

This content originally appeared on Twilio Blog and was authored by Ezzeddin Abdullah

ChatGPT has been making headlines lately as one of the most advanced and widely-used language models. Its ability to understand natural language and generate human-like responses has made it a popular tool for businesses looking to improve their customer service through AI chatbots.

In this tutorial, you'll learn how to build an AI chatbot with the OpenAI API that can engage with customers on WhatsApp. With this chatbot, you'll be able to provide customers with intelligent responses to their inquiries. This AI chatbot is a ChatGPT-like interface but on WhatsApp.

You'll start by setting up the backend using FastAPI and SQLAlchemy to create a PostgreSQL database to store your customers' conversations. Then, you'll integrate Twilio's WhatsApp Messaging API, allowing customers to initiate conversations with your WhatsApp chatbot.

With pyngrok, you'll expose the local FastAPI server to the internet through Python, making it reachable by Twilio's webhook requests.

Finally, the core of this AI chatbot will be built using OpenAI's API and one of the models in the GPT-3.5 series.

Prerequisites

To follow this tutorial, you will need the following prerequisites:

  • Python 3.7+ installed on your machine.
  • PostgreSQL installed on your machine.
  • A Twilio account set up. If you don't have one, you can create a free account here.
  • An OpenAI API key.
  • A smartphone with WhatsApp installed to test your AI chatbot.
  • A basic understanding of FastAPI, a modern, fast (high-performance) web framework for building APIs with Python 3.7+.
  • A basic understanding of what an ORM is. If you are not familiar with ORM, we recommend you read this wiki page to get an idea of what it is and how it works.

Setting up your development environment

Before building the chatbot, you need to set up your development environment. Start with creating a new virtual environment:

mkdir ai_chatbot
cd ai_chatbot
python3 -m venv venv; . venv/bin/activate; pip install --upgrade pip

Here, you create the ai_chatbot directory and navigate into it. Then you create a new Python virtual environment using venv. Finally, you activate the environment and then upgrade pip, the Python package manager.

Next, create a requirements.txt file that includes the following:

fastapi
uvicorn
twilio
openai
python-decouple
sqlalchemy
psycopg2-binary
python-multipart
pyngrok

Here is a breakdown of these dependencies:

  1. fastapi: A package for FastAPI, a modern web framework for building APIs with Python 3.7+ based on standard Python type hints. It's designed to be easy to use, fast, and to provide automatic validation of request and response data.
  2. uvicorn: A package for Uvicorn, a fast ASGI server implementation built on uvloop and httptools, with WebSocket support provided by the websockets library.
  3. twilio: A Python helper library for Twilio, a cloud communications platform that allows software developers to programmatically make and receive phone calls, send and receive text messages, and perform other communication functions using its web service APIs.
  4. openai: A Python client for OpenAI, a research company that focuses on developing and advancing artificial intelligence in a way that is safe and beneficial for humanity. OpenAI offers various AI models, including the text-davinci-002 model, which is used in this tutorial to power the chatbot.
  5. python-decouple: A library for separating the settings of your Python application from the source code. It allows you to store your settings in an environment file, instead of hardcoding them into your code.
  6. sqlalchemy: A package for SQLAlchemy, a Python SQL toolkit and Object-Relational Mapping (ORM) library. It provides a set of high-level APIs for connecting to relational databases, executing SQL queries, and mapping database tables to Python classes.
  7. psycopg2-binary: A Python package that provides a PostgreSQL database adapter for the Python programming language.
  8. python-multipart: A library that allows you to parse multipart form data in Python, which is commonly used to handle form submissions that contain files such as images or videos. In the case of this tutorial, it will be used to handle form data from the user's input through the WhatsApp chatbot.
  9. pyngrok: A Python wrapper for ngrok, a tool that allows you to expose a web server running on your local machine to the internet. You'll use it to test your Twilio webhook while you send WhatsApp messages.

Now, you can install these dependencies:

pip install -r requirements.txt

Configuring your database

You can use your own PostgreSQL database or set up a new database with the createdb PostgreSQL utility command:

createdb mydb

In this tutorial, you will use SQLAlchemy to access the PostgreSQL database. Put the following into a new models.py file:

from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.engine import URL
from sqlalchemy.orm import declarative_base, sessionmaker
from decouple import config


url = URL.create(
    drivername="postgresql",
    username=config("DB_USER"),
    password=config("DB_PASSWORD"),
    host="localhost",
    database="mydb",
    port=5432
)

engine = create_engine(url)
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()

class Conversation(Base):
    __tablename__ = "conversations"

    id = Column(Integer, primary_key=True, index=True)
    sender = Column(String)
    message = Column(String)
    response = Column(String)


Base.metadata.create_all(engine)

This code sets up a connection to a PostgreSQL database using SQLAlchemy and creates a table named conversations. Here's a breakdown of what each part does:

  • URL.create creates the URL object which is used as the argument for the create_engine function. Here, it specifies the drivername, username, password, host, database, and port of the database.
  • create_engine function creates an engine object that manages connections to the database using a URL that contains the connection information for the database.
  • sessionmaker is a factory for creating Session objects that are used to interact with the database.
  • declarative_base is a factory function that returns a base class that can be subclassed to define mapped classes for the ORM.
  • The Conversation class is a mapped class that inherits from Base and maps to the conversations table. It has four columns: id, sender, message, and response. id is the primary key, sender is a string column that holds the phone number the message was sent from, message holds the incoming message text, and response holds the reply generated by GPT-3.5.
  • Base.metadata.create_all creates all tables in the database (in this case, it creates the conversations table) if they do not exist.

So the goal of this simple model is to store conversations for your app.
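
For example, once the app has stored a few rows, you could inspect them from a Python shell using the same session factory. This is just a quick verification sketch, not part of the tutorial's files:

from models import Conversation, SessionLocal

# Open a session, print every stored conversation, then close the session
db = SessionLocal()
for convo in db.query(Conversation).all():
    print(convo.id, convo.sender, convo.message, "->", convo.response)
db.close()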

Note: here, you've used decouple.config to access the environment variables for your database: DB_USER and DB_PASSWORD. You should now create a .env file that stores these credentials with their associated values. Something like the following, but replacing the placeholder text with your actual values:

DB_USER=<your-database-username>
DB_PASSWORD=<your-database-password>

Creating your chatbot

Now that you have set up your environment and created the database, it's time to build the chatbot. In this section, you will write the code for a basic chatbot using OpenAI and Twilio.

Configuring your Twilio Sandbox for WhatsApp

To use Twilio's Messaging API to enable the chatbot to communicate with WhatsApp users, you need to configure the Twilio Sandbox for WhatsApp. Here's how to do it:

  1. Assuming you've already set up a new Twilio account, go to the Twilio Console and choose the Messaging tab on the left panel.
  2. Under Try it out, click on Send a WhatsApp message. You'll land on the Sandbox tab by default and you'll see a phone number "+14155238886" with a code to join next to it on the left and a QR code on the right.
  3. To enable the Twilio testing environment, send a WhatsApp message containing that join code to the displayed phone number. If you are using WhatsApp Web, you can click the hyperlink to open the chat directly; otherwise, scan the QR code with your phone.

Now, the Twilio sandbox is set up, and it's configured so that you can try out your application after setting up the backend.

Before leaving the Twilio Console, you should take note of your Twilio credentials and edit the .env file as follows:

DB_USER=<your-database-username>
DB_PASSWORD=<your-database-password>
TWILIO_ACCOUNT_SID=<your-twilio-account-sid>
TWILIO_AUTH_TOKEN=<your-twilio-auth-token>
TWILIO_NUMBER=+14155238886

Setting up your Twilio WhatsApp API snippet

Before setting up the FastAPI endpoint that receives POST requests from Twilio, let's first build a utility script for sending a WhatsApp message through the Twilio Messaging API.

Create a new file called utils.py and fill it with the following code:

# Standard library import
import logging

# Third-party imports
from twilio.rest import Client
from decouple import config


# Find your Account SID and Auth Token at twilio.com/console
# and set the environment variables. See http://twil.io/secure
account_sid = config("TWILIO_ACCOUNT_SID")
auth_token = config("TWILIO_AUTH_TOKEN")
client = Client(account_sid, auth_token)
twilio_number = config('TWILIO_NUMBER')

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Sending message logic through Twilio Messaging API
def send_message(to_number, body_text):
    try:
        message = client.messages.create(
            from_=f"whatsapp:{twilio_number}",
            body=body_text,
            to=f"whatsapp:{to_number}"
            )
        logger.info(f"Message sent to {to_number}: {message.body}")
    except Exception as e:
        logger.error(f"Error sending message to {to_number}: {e}")

First, the necessary libraries are imported: the logging module, the Twilio REST client, and the decouple library used to read your private credentials from the .env file.

Next, the Twilio Account SID, Auth Token, and phone number are retrieved from the .env file using the decouple library. The Account SID and Auth Token are required to authenticate your account with Twilio, while the phone number is the Twilio WhatsApp sandbox number.

Then, a logging configuration is set up for the function to log any info or errors related to sending messages. If you want more advanced logging to use as a boilerplate, check this out.

The meat of this utility script is the send_message function which takes two parameters, the to_number and body_text, which are the recipient's WhatsApp number and the message body text, respectively.

The function tries to send the message using the client.messages.create method, which takes the Twilio sandbox number as the sender (from_), the message body text (body), and the recipient's WhatsApp number (to). If the message is sent successfully, the function logs an info message with the recipient's number and the message body; if sending fails, it logs an error with the exception details.
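
If you'd like to sanity-check your Twilio credentials and sandbox setup before building the endpoint, you can call the helper directly from a Python shell. The number below is a placeholder, so replace it with the WhatsApp number that joined your sandbox:

from utils import send_message

# Sends a test message from the Twilio sandbox number to your own WhatsApp
send_message("+15551234567", "Hello from the utility script!")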

Setting up your FastAPI backend

To set up the FastAPI backend for the chatbot, navigate to the project directory and create a new file called main.py. Inside that file, you will set up a basic FastAPI application that will handle a single incoming request:

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def index():
    return {"msg": "up & running"}

To run the app, run the following command:

uvicorn main:app --reload

Open your browser to http://localhost:8000. The result you should see is a JSON response of {"msg": "up & running"}.
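
If you prefer the terminal, a quick curl check against the same endpoint works too:

curl http://localhost:8000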

However, since Twilio needs to send requests to your backend, your app must be reachable from the public internet. An easy way to do that during development is ngrok.

If you're new to Ngrok, you can consult this blog post and create a new account.

Leave the FastAPI app running on port 8000, and run this ngrok command:

ngrok http 8000

The above command creates a tunnel between your local server running on port 8000 and a public URL on the ngrok.io domain. Once you have the ngrok forwarding URL, any request a client sends to it is forwarded to your FastAPI backend.

[Image: ngrok terminal]

If you click on the forwarding URL, Ngrok will redirect you to your FastAPI app's index endpoint. It's recommended to use the https prefix when accessing the URL.
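
Since pyngrok is already in your requirements, you can alternatively open the tunnel from Python instead of the ngrok CLI. The sketch below is optional and assumes your ngrok auth token is already configured on your machine:

from pyngrok import ngrok

# Open an HTTP tunnel to the FastAPI server running on port 8000
tunnel = ngrok.connect(8000)
print("Forwarding URL:", tunnel.public_url)

# Keep the tunnel open while you test; close it when you're done
# ngrok.disconnect(tunnel.public_url)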

Configuring the Twilio webhook

You must set up a Twilio-approved webhook to be able to receive a response when you message the Twilio WhatsApp sandbox number.

To do that, head over to the Twilio Console and choose the Messaging tab on the left panel. Under the Try it out tab, click on Send a WhatsApp message. Next to the Sandbox tab, choose the Sandbox settings tab.

Copy the ngrok.io forwarding URL and append /message. Paste it into the box next to WHEN A MESSAGE COMES IN:

[Image: WhatsApp sandbox settings]

The complete URL should look like this: https://d8c1-197-36-101-223.ngrok.io/message.

The endpoint you will configure in the FastAPI application is /message, as noted. The chatbot logic will be on this endpoint.

When done, press the Save button.

Sending your message with OpenAI API

Now, it's time to create the logic for sending the WhatsApp message to the OpenAI API so that you'll get a response from the AI chatbot.

Update the main.py script to the following:

# Third-party imports
import openai
from fastapi import FastAPI, Form, Depends
from decouple import config
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import Session

# Internal imports
from models import Conversation, SessionLocal
from utils import send_message, logger


app = FastAPI()
# Set up the OpenAI API client
openai.api_key = config("OPENAI_API_KEY")
whatsapp_number = config("TO_NUMBER")

# Dependency
def get_db():
    try:
        db = SessionLocal()
        yield db
    finally:
        db.close()

@app.post("/message")
async def reply(Body: str = Form(), db: Session = Depends(get_db)):
    # Call the OpenAI API to generate text with GPT-3.5
    response = openai.Completion.create(
        engine="text-davinci-002",
        prompt=Body,
        max_tokens=200,
        n=1,
        stop=None,
        temperature=0.5,
    )

    # The generated text
    chat_response = response.choices[0].text.strip()

    # Store the conversation in the database
    try:
        conversation = Conversation(
            sender=whatsapp_number,
            message=Body,
            response=chat_response
            )
        db.add(conversation)
        db.commit()
        logger.info(f"Conversation #{conversation.id} stored in database")
    except SQLAlchemyError as e:
        db.rollback()
        logger.error(f"Error storing conversation in database: {e}")
    send_message(whatsapp_number, chat_response)
    return ""

You've set up the /message endpoint so that the app will listen to the incoming POST requests to that endpoint and generate a response using the OpenAI API and GPT-3.5 model.

The code imports several third-party libraries, including openai, FastAPI, decouple, and SQLAlchemy. It also imports two modules defined in the same directory: models.py and utils.py.

The openai.api_key variable is set to the value of the OPENAI_API_KEY environment variable using the config() function from the decouple library. The whatsapp_number variable is also set using the config() function. Make sure to update your .env file with these variables. The .env file should now look like the following:

DB_USER=<your-database-username>
DB_PASSWORD=<your-database-password>
TWILIO_ACCOUNT_SID=<your-twilio-account-sid>
TWILIO_AUTH_TOKEN=<your-twilio-auth-token>
TWILIO_NUMBER=+14155238886
OPENAI_API_KEY=<your-openai-api-key>
TO_NUMBER=<your-whatsapp-phone-number>

A get_db() function is defined as a dependency and injected into the endpoint with FastAPI's Depends. It creates a new database session using the SessionLocal factory from models.py and yields it to the path operation function. Once the request has been handled, the session is closed in the finally block.

The main function of the code is the reply() function, which is decorated with the @app.post('/message') decorator. This function takes in a message body as a parameter and a database session object obtained from the get_db() dependency.

Inside the function, the OpenAI API is called using the openai.Completion.create() method, passing in the message body as a prompt. The text-davinci-002 model is used to generate a response, which is stored in the chat_response variable.
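
If you want to confirm your API key and model access before wiring up the webhook, a quick standalone check using the same completion call (run in a Python shell, not part of main.py) might look like this:

import openai
from decouple import config

openai.api_key = config("OPENAI_API_KEY")

# Ask the same model used in main.py for a short completion
response = openai.Completion.create(
    engine="text-davinci-002",
    prompt="Say hello in one short sentence.",
    max_tokens=20,
    temperature=0.5,
)
print(response.choices[0].text.strip())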

The generated response is stored in the database using the Conversation model defined in models.py. If there is an error storing the conversation in the database, the transaction is rolled back using the db.rollback() method.

Finally, the generated response is sent to the user's WhatsApp number using the send_message() function defined in utils.py. Twilio only needs a successful HTTP response from the webhook, so the endpoint simply returns an empty string.
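
You can also exercise the endpoint locally before involving Twilio by posting the Body form field yourself; the reply will still be sent to the TO_NUMBER from your .env file:

curl -X POST http://localhost:8000/message -d "Body=Tell me a fun fact about space"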

Testing your AI chatbot

Now, you're ready to send a WhatsApp message and wait for a response from your AI chatbot. Try asking the AI chatbot anything you can ask ChatGPT.

The example below shows a couple of questions and their responses:

[Image: WhatsApp AI chatbot conversation]

Now, your AI chatbot is up and running on WhatsApp. Perhaps your next step is to deploy it to production on a VPS instead of running it locally. I hope you enjoyed this tutorial and see you in the next one.

Ezz is an AWS Certified Machine Learning Specialist and a Data Platform Engineer. He helps tech companies generate developer leads through his technical blog posts. Check out his website for more.

