Summary

To get better responses from ChatGPT, you should provide detailed and specific prompts that give it enough information and context to understand what you’re looking for. Avoid vague or ambiguous prompts and be aware that ChatGPT may not have access to all the information you need. Additionally, back-and-forth conversations with ChatGPT will refine its output.

For an AI that supposedly knows so much, it can be tricky to get the answers you need from this chatbot. Before you run back to Google Search, here are some common issues you can address to get better answers.

Your Prompts Are Too Short

Since ChatGPT is (notionally) a chatbot, you may have the preconceived idea that your prompts should be short and conversational. While there’s nothing wrong with that, of course, consider that you may not be giving ChatGPT enough information for it to give you the responses you want.

Go into detail and really explain what it is you want (positive prompts) and what you don’t want (negative prompts), while painting a broader picture for the AI to work with. It’s better to err on the side of writing a prompt that’s longer and more detailed than you think is necessary.

You’re Not Being Specific

Large Language Models (LLMs) like ChatGPT are packed with an unimaginable amount of knowledge and data, which makes it extremely hard for the model to give you the output you actually want if your questions are too broad.

For example, these prompts are too vague to be useful:

Instead, you’d use prompts like these:

Both of these sets of examples were provided by ChatGPT using the prompts “I need some examples of ChatGPT prompts that aren’t specific enough to get the output users intended” and “For each of these, provide better versions that are specific enough.” Don’t be afraid to use ChatGPT to create prompts or to ask it for examples of good prompts you can repurpose.

By asking specific questions, you’ll get much better responses, and the more specific you get, the better they’ll be in most cases.

Ambiguous Prompts

One of the main reasons LLMs are so amazing is that they can handle human language, which is complicated (and that’s the understatement of the century). Unfortunately, that complexity also means it’s easy for ambiguity to creep into your prompts.

An ambiguous prompt is one that can be interpreted in multiple ways that are equally valid. Sometimes this is a problem with the logic or phrasing of your prompt, but most often it’s simply because you’ve asked a question with so many answers that it’s hard for ChatGPT to know which answer you’re actually looking for.

For example, if you ask “What’s the best way to cook chicken?” ChatGPT has to grapple with the different ways something could be “best.”

If, on the other hand, you asked “What’s the best way to cook chicken for my health?” you’d narrow things down, and if you asked “What’s the best way to cook chicken for someone with diabetes?” you’d really be zeroing in on what you actually need.

A Lack of Context

Ambiguous prompts mainly suffer from a lack of context, but almost any type of prompt for ChatGPT will benefit from adding more context. ChatGPT is highly sensitive to contextual cues, so the more context you provide, the better your results will be.

You can actually see this clearly when asking for something like an outline for writing. If you ask for an outline for a blog article, you’ll get a very different result compared to asking for an outline for a book or an academic article.

If you ask ChatGPT to convert text into a script for a YouTube video, the output is completely different than what you get when asking for a script meant for a TV show.

These are simple examples, but ChatGPT can pick up on nuanced contextual clues, so it’s a good idea to get into the habit of elaborating and describing what you want with terms and keywords that give the software clues about what you’re trying to get from it.

It Doesn’t Have the Right Information

While LLMs like ChatGPT have a lot of data to work with, there are clear limits to what they know or can know. Apart from ChatGPT’s (current) training data cutoff of September 2021, there are some things it just can’t know.

However, you can tell it the facts it needs to know for your specific situation, or even copy and paste text from sources you want it to use. This new knowledge won’t persist in future chats unless you save your conversations and reintroduce the information.

ChatGPT will “believe” whatever you tell it for the purposes of generating responses to your prompts, so you need to make sure any information you feed it is accurate for your purposes.

Also, don’t forget that ChatGPT can make things up, give you illogical or incorrect information, and generally act like an unreliable source!
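If you use ChatGPT through the API rather than the web interface, the same principle applies: the model only knows what you send it in each request. Here’s a minimal sketch using the OpenAI Python SDK, where the model name and the pasted “facts” are just illustrative placeholders:

```python
# Minimal sketch: supplying your own source material in the prompt.
# Assumes the OpenAI Python SDK (v1.x) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

# Hypothetical facts the model can't know on its own.
source_text = """Acme Co. launched its Model X widget in March 2023.
It weighs 1.2 kg and ships in three colors."""

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Using only the facts below, write a one-paragraph product "
                "summary for a store listing.\n\n"
                f"Facts:\n{source_text}"
            ),
        }
    ],
)

print(response.choices[0].message.content)
```

Because every request starts from scratch, you’d have to include that source text again in any later conversation that depends on it, which is the programmatic equivalent of reintroducing saved information into a new chat.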

You Need to Have a Back-and-Forth Conversation

ChatGPT’s ability to remember the entire chat history and use it as context to interpret subsequent prompts is one of its most powerful features. It also means that you can iterate on what you want based on its responses.

Instead of just hitting the “Regenerate Response” button and hoping for a better reply, you can give a new prompt that builds on what’s already happened in the thread, such as “Make that shorter and more casual,” “Expand the third point into its own paragraph,” or “Rewrite it from the perspective of a beginner.”

Really, you can modify and transform ChatGPT’s outputs in just about any way that can be expressed as language, so take the time to have a back-and-forth with the software as if you’re collaborating with another person, and you can quickly refine its output to exactly what you need.
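If you’re working with the API instead of the chat interface, that back-and-forth is something you manage yourself: the “memory” is simply the message history you send with each request. Here’s a rough sketch under the same assumptions as above (OpenAI Python SDK, placeholder model name and prompts):

```python
# Rough sketch of a back-and-forth conversation via the API.
# The model's "memory" is just the growing messages list sent with each request.
from openai import OpenAI

client = OpenAI()
messages = [
    {
        "role": "user",
        "content": "Write a short outline for a blog post about composting at home.",
    }
]

first = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
outline = first.choices[0].message.content
print(outline)

# A follow-up prompt that builds on the previous answer instead of starting over.
messages.append({"role": "assistant", "content": outline})
messages.append(
    {
        "role": "user",
        "content": "Make the tone friendlier and add a section on common mistakes.",
    }
)

second = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(second.choices[0].message.content)
```

Each follow-up request resends everything the model has said so far, which mirrors the way the chat interface uses your conversation history as context.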