This content originally appeared on Level Up Coding - Medium and was authored by Michael K
I have been thoroughly impressed by the endless possibilities that GPT-4 and other large language models offer us. This sparked an idea when I was having trouble preparing a dataset for a webinar: I needed a programmatic way to ask simple questions, such as “When did the first Iron Man movie come out?”, and get succinct, accurate answers back.
While models like this can sometimes return incorrect data, most of the questions I had to ask were easy to answer, such as public figures’ birth years or the number of movies they’ve directed. There is also the issue that the model has no knowledge past September 2021, so I can’t ask it for recent facts. Even with these caveats, I thought it would be useful and a fun project.
The Problem
If you’ve ever used a language model before, you know it pads its answers with fluff and will sometimes give you far too much information when all you want is a simple answer. Let’s take the example below and see how it responds:
[Prompt] What year was James Cameron, the director, born?
[Answer] James Cameron, the renowned director, was born on August 16, 1954.
But if we were preparing a dataset, we couldn’t insert this response as-is, since it contains additional information we don’t need, such as his name, which we already know. Fortunately for us, we can instruct language models to respond exactly how we would like. I’ve seen great examples of this before, where you can have a model respond in simple Yes/No terms, or even a percentage:
[Prompt] Is the color blue in the rainbow?
[Answer] Yes, the color blue is in the rainbow. A rainbow consists of a
spectrum of colors, including red, orange, yellow, green, blue,
indigo, and violet.
[Prompt] Is the color blue in the rainbow? Respond only with Yes or No.
[Answer] Yes
What we need to do is tell the model to only reply with the answer and not include any additional context. We can do this by adding extra instructions to our prompt:
[Prompt] What year was James Cameron, the director, born? Your reply
should only be the answer itself, no additional text - this
includes describing the answer.
[Answer] 1954
Perfect! We could now take the response from the API and use it to fill in any data we are missing, such as the directors’ birth years in my case. But what if the model doesn’t know the answer to the question? We also need to account for this case, as we can see below:
[Prompt] Who directed the tenth Shrek movie? Your reply should only be
the answer itself, no additional text - this includes
describing the answer.
[Answer] As of my last knowledge update in September 2021, there were
only four Shrek movies released, with the last one,
Shrek Forever After, released in 2010. There is no information
on a tenth Shrek movie, and it's possible that it hasn't been
made yet.
In the same manner that we can tell it how to reply, we can also give it canned responses for certain scenarios. This way, we can detect this case easily in our code and handle it correctly:
[Prompt] Who directed the tenth Shrek movie? Your reply should only be
the answer itself, no additional text - this includes
describing the answer. If you do not know the answer to the
question, respond with the sentence "I do not know the answer
to that question." and nothing else.
[Answer] I do not know the answer to that question.
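In code, detecting this canned response can be as simple as checking for the sentinel string. Here’s a minimal sketch (the constant and helper names are my own):

```python
# The exact sentence we told the model to use when it can't answer.
UNKNOWN_SENTINEL = 'I do not know the answer to that question.'

def is_unknown(answer):
    # Match without the trailing period, since some models add or drop it.
    return UNKNOWN_SENTINEL.rstrip('.') in answer
```

A substring check is deliberately loose here: even if the model wraps the sentinel in extra text, we still catch it.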
Differences between models
Depending on which language model you are using, it will parse the prompt and respond differently. For example, when using GPT-3.5 with these examples, it includes periods at the end of its answers, but this doesn’t occur with GPT-4. You may also have to tweak the prompt slightly if the model doesn’t seem to understand or still gives overly verbose answers, though I didn’t run into this issue with OpenAI’s models.
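To smooth over these differences, we can normalize answers before using them, for example by stripping a single trailing period. A small sketch (the function name is my own):

```python
from re import sub

def normalize_answer(answer):
    # Trim surrounding whitespace, then drop one trailing period
    # (GPT-3.5 tends to add it, GPT-4 does not).
    return sub(r'\.$', '', answer.strip())
```

The `$`-anchored pattern only removes a period at the very end, so answers containing internal punctuation are left alone.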
Implementation
Now that we’ve got the model responding as we want, we need to use one of OpenAI’s libraries to communicate with the API and automate the process. There are official clients for Node.js and Python, but there are also many unofficial community libraries for other languages.
I’m using the Python client today, but you could implement this same code with any library, as it’s quite a straightforward use case. The first thing I’ll do is set up my prompts for the system:
from datetime import datetime
from time import mktime, gmtime

def get_timestamp():
    return datetime.fromtimestamp(mktime(gmtime())).isoformat()
system_prompts = [
    'You are a helpful assistant who answers questions for Users.',
    'Your reply should only be the answer itself, no additional text - this includes describing the answer.',
    'If you do not know the answer to the question, respond with the sentence "I do not know the answer to that question." and nothing else.',
    f'Knowledge Cutoff: September 2021. Current Date and Time: {get_timestamp()} GMT'
]
We can then create a ChatCompletion request to send our system prompt and our User’s question to OpenAI’s API:
import openai

# Pose our question
question = 'Who directed the eleventh Madagascar movie?'

# Submit the request
completion = openai.ChatCompletion.create(
    #model='gpt-3.5-turbo',  # If you don't have access to GPT-4
    model='gpt-4',
    temperature=0.2,  # Generate more stable responses
    messages=[
        # System prompts
        {
            'role': 'system',
            'content': ' '.join(system_prompts)
        },
        # User's question
        {
            'role': 'user',
            'content': question
        }
    ]
)

# Get our answer
answer = completion.choices[0].message.content
print(answer)
# Output: I do not know the answer to that question.
Improvements
We can then improve this code by turning it into a function and adding some error handling. It’s also possible that we ask the model an invalid question, so we’ll want to handle this as well:
from re import sub

class RequestFailed(Exception):
    pass

class UnknownAnswer(Exception):
    pass

def ask_question(question, model='gpt-3.5-turbo', temperature=0.2):
    # Submit the request
    completion = openai.ChatCompletion.create(
        model=model,
        temperature=temperature,
        messages=[
            # System prompts
            {
                'role': 'system',
                'content': ' '.join(system_prompts)
            },
            # User's question
            {
                'role': 'user',
                'content': question
            }
        ]
    )

    # Ensure we received a response
    if not len(completion.choices):
        raise RequestFailed('Unable to fetch an answer from the model.')

    # Get just the answer and trim the trailing period (for GPT-3.5)
    answer = sub(r'\.$', '', completion.choices[0].message.content)

    # Check if it knows the answer
    if 'I do not know the answer to that question' in answer:
        raise UnknownAnswer('Question could not be answered.')

    # Ensure this is a valid question
    if 'not a valid question' in answer:
        raise UnknownAnswer('Invalid question posed.')

    return answer
Then, wherever we use this function, we just need to catch three different types of exceptions: UnknownAnswer, RequestFailed, or OpenAIError:
answer = None
question = 'What year was James Cameron born in?'

try:
    answer = ask_question(question, model='gpt-4')
except openai.error.OpenAIError as e:
    print(question, 'request error:', e.user_message)
except UnknownAnswer as e:
    print(question, 'could not be answered:', e)
except RequestFailed as e:
    print(question, 'request failed:', e)

if answer:
    print(question, answer)

# Output: What year was James Cameron born in? 1954
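From here, building up the dataset is just a loop over our questions, recording which ones failed so we can fill them in by hand later. The sketch below stubs ask_question with canned answers so it runs without API access — swap in the real function from above to make live requests.

```python
class UnknownAnswer(Exception):
    pass

# Stub standing in for the real ask_question defined earlier, so this
# sketch runs offline; the canned answers here are illustrative only.
def ask_question(question, model='gpt-4'):
    canned = {'What year was James Cameron born in?': '1954'}
    if question not in canned:
        raise UnknownAnswer('Question could not be answered.')
    return canned[question]

questions = [
    'What year was James Cameron born in?',
    'Who directed the tenth Shrek movie?',
]

# Collect answers, flagging unanswerable questions for manual follow-up
dataset = {}
for q in questions:
    try:
        dataset[q] = ask_question(q)
    except UnknownAnswer:
        dataset[q] = None
```

Storing None for failures keeps every question in the dataset, so nothing silently disappears when the model can’t answer.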
Conclusion
Learning to engineer and craft responses from language models like GPT-4 will continue to be an important and exciting skill to master in the coming years as they evolve and develop new capabilities. And I know in my case, I’m always looking to save time so anything I can do to speed up my work is a godsend.
Was there anything else I could’ve done to improve these prompts? I would love to hear your feedback in the comments!
Michael K | Sciencx (2023-03-30T15:41:22+00:00) Prompt crafting with GPT-4 to make my life easier. Retrieved from https://www.scien.cx/2023/03/30/prompt-crafting-with-gpt-4-to-make-my-life-easier/