The 7 Best Prompts for Chat-GPT

Alana Grace

Photo by Mariia Shalabaieva on Unsplash

There are many tips for writing optimal inputs for an AI chatbot. “Write clearly and precisely” and “use synonyms” are common suggestions; referring to the bot as an “expert” is another. All of them promise improved responses from the artificial intelligence, and indeed, the output of Chat-GPT & Co. depends heavily on your input. That’s why we’ve experimented with the most common tips for Chat-GPT and compiled the best ones for you here.

The input in a chatbot is often referred to as a prompt. This term suits it well because it can encompass questions, tasks, or any other text input. Before delving into the best prompts, let’s briefly introduce the most notable AI chatbots.

Which AI chatbot is the best?

You have several options when it comes to chatbots capable of generating text. The most prominent ones include OpenAI’s Chat-GPT, Microsoft’s Bing Chat, and Google’s Bard. AI tools like Jasper Chat and Chatflash are aimed more at business users because of their pricing, while Google’s Bard responds quickly and accurately and now supports German as well.


Microsoft’s Bing Chat requires the Edge browser: launch Edge and click on the blue “b” in the upper right corner. While Bing Chat is based on Chat-GPT from OpenAI, its configuration is distinct. It often provides answers with accompanying website links, allowing you to evaluate the answer’s accuracy. However, this approach also restricts the variety of Bing Chat’s responses.

To address this limitation, the bot offers a “creative” mode, in which it refrains from attaching website links as supporting evidence for its answers. Nonetheless, based on our experience, the answers in this mode aren’t as compelling as those from Chat-GPT by OpenAI. We therefore suggest using the Bing Bot primarily when you want a website source as part of the answer.

Chat-GPT: This bot has created waves in the AI field since the end of last year. You can use it for free on OpenAI’s website. The free tool runs on version 3.5, which comprehends complex questions and commands and often provides competent, though not always accurate, answers.

For the more advanced version 4.0, you can subscribe to “Chat-GPT Plus” for around $20 per month. This version offers enhanced capabilities, particularly when dealing with intricate topics, and it lifts restrictions on the volume of text input and output. The paid version also provides plug-ins that extend the AI’s capabilities, and the range of plug-ins is expanding rapidly.

In our experience, you can achieve good results with the free Chat-GPT 3.5, making it a recommended choice for all inquiries. The additional benefits of version 4.0 are more appealing to professional users.

In general, our prompt recommendations apply to all language models. However, we mainly focus on Chat-GPT and have tested our suggestions within that context.

Context, Context, and Context Again

Most prompt tips on the internet, including the following ones, revolve around the concept of “context.” Here’s a brief explanation:

Large language models, including the mentioned bots, primarily work with statistical probabilities of word sequences. These tools predict the statistically likely next word in a response. This prediction is based on the vast amounts of text they were initially trained on. The likelihood of words appearing in a text heavily relies on the context.

When discussing food, the context differs whether the answer is for a home cook or a food chemist. Hence, the more context you provide Chat-GPT, the better it can generate accurate output.
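As a toy illustration of this idea, a bigram model picks the most frequent follower of the current word. This is only a sketch of the statistical principle, not how Chat-GPT actually works internally, and the training text here is invented:

```python
from collections import Counter, defaultdict

# Tiny invented training text; real models are trained on vastly more.
corpus = ("the cook seasons the soup and the cook tastes it "
          "and the cook serves the soup").split()

# Count, for each word, how often each possible next word follows it.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cook" follows "the" most often in this corpus
```

Context changes these counts: a corpus of cookbooks and a corpus of chemistry papers would yield very different “most likely next words,” which is exactly why the context you supply shapes the answer.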

OpenAI also provides prompt suggestions on its website. However, these tips are mainly aimed at API users, i.e., programmers integrating Chat-GPT into their applications.
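For API users, a prompt is sent as a list of messages rather than a single text box. A minimal sketch of how such a request body is assembled (the helper function is our own illustration; sending it would require an HTTP call to OpenAI with an API key, which we skip here):

```python
import json

def build_chat_request(system_role, user_prompt, model="gpt-3.5-turbo"):
    """Assemble the JSON body for a Chat Completions API request.

    Only the payload is constructed here, to show the message structure;
    no network request is made.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_role},
            {"role": "user", "content": user_prompt},
        ],
    }

payload = build_chat_request(
    "You are a marketing expert.",
    "Create a compelling advertisement for a new product launch.",
)
print(json.dumps(payload, indent=2))
```

The “system” message plays the same role-assignment trick described in Tip 1, just in a dedicated field.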

Tip 1: Assign a Role to Chat-GPT

You’ll receive better answers from Chat-GPT by assigning it a role first. Begin the prompt with:

You are an [expert/blog author/marketing expert/chef/trainer], and you need to [task].

You can vary this formulation; for instance, it could also be:

As a [hobbyist/consultant/enthusiast], you should [task].

The word “expert” has a strong signaling effect. According to user reports, the term “consultant” also has a significant impact when assigning a role. Instead of “role,” some tips refer to “persona,” which conveys the same meaning. Following this sentence, you proceed with the actual task. In our example, it could look like this:

You are a marketing expert, and you need to create a compelling advertisement for a new product launch.

To improve results further, consider adding more information to your prompt. In this instance, you could specify the product, its key features, and the target audience.

Of course, Chat-GPT doesn’t understand this instruction as a person would. It doesn’t genuinely take on the role of a blog author, marketing expert, chef, or trainer. However, in its response, it considers the knowledge it has learned in these areas to a greater extent.
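The pattern can be captured in a small template helper (our own illustration; the role and task strings are placeholders you fill in):

```python
def role_prompt(role, task):
    """Build a prompt that first assigns Chat-GPT a role, then states the task."""
    return f"You are a {role}, and you need to {task}."

print(role_prompt(
    "marketing expert",
    "create a compelling advertisement for a new product launch",
))
```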

Tip 2: Explicitly Ask Chat-GPT for Help

This prompt begins with the question:

Can you help me [goal]?

Chat-GPT will present you with a list of steps to accomplish the mentioned goal. If you need the AI’s assistance with any of those steps, simply request the tool’s help with that particular one. For example:

Can you help me plan a healthy and balanced weekly meal plan?

Tip 3: Let Chat-GPT Ask You Questions

This strategy is built on an intriguing assumption: Chat-GPT was designed for conversations. Instead of crafting a perfect prompt to yield an ideal answer, it’s often better to engage Chat-GPT in a conversation to achieve the best outcome.

This is accomplished with the prompt:

Can you help me achieve [goal]? You can ask me as many questions as you need to help me.

With this formulation, the AI proceeds to ask you several questions, typically numbered. You can respond to them in the format:

Q1: [Your answer]
Q2: [Your answer]

After providing these answers, Chat-GPT will guide you with specific steps to attain your goal.


Can you help me write an engaging blog post about sustainable living? You can ask me as many questions as you need to help me.

Tip 4: Context, Instruction, Details, Input

The CIDI approach comes from AI specialist Gianluca Mauro. It stands for Context, Instruction, Details, Input, and complements previous tips.

Mauro elucidates his approach with this example: Drafting a complaint email to an airline due to a flight delay. A simple prompt could be:

Write a complaint email to an airline because a flight was delayed.

The Chat-GPT response to this is adequate, but it can be notably enhanced with CIDI. Mauro rephrases the prompt as follows:

Compose an email of complaint to an airline explaining the dissatisfaction with a delayed flight. Include details about the flight, departure and arrival times, and any inconveniences caused. Express the frustration and request suitable compensation for the inconvenience.

In your CIDI-based prompt, enter the input data in quotation marks, all within a single prompt. The response to the CIDI example does sound more formal; in our attempt, it read as though it had been crafted by a legal professional, so as a non-lawyer you wouldn’t send it as-is. With a few adjustments, however, you can obtain a usable email.

Mauro also emphasizes that Chat-GPT input doesn’t need to be full sentences. You can work with keywords, as he demonstrated under “Input.”
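Mauro’s four parts can be sketched as a small helper that assembles them into one prompt (our own sketch; the flight details below are invented and only illustrate the keyword-style input):

```python
def cidi_prompt(context, instruction, details, input_data):
    """Combine Context, Instruction, Details, and Input into a single prompt.

    The input data goes in quotation marks, as recommended above; it may be
    keywords rather than full sentences.
    """
    return f'{context} {instruction} {details}\nInput: "{input_data}"'

print(cidi_prompt(
    "My flight yesterday arrived four hours late.",            # Context (invented)
    "Compose an email of complaint to the airline.",           # Instruction
    "Express frustration and request suitable compensation.",  # Details
    "flight XY123, departure 9:40, arrival 18:05",             # Input (keywords, invented)
))
```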

Tip 5: Provide as Much Input as Possible

An AI chatbot’s text-processing capacity is limited; it is measured in tokens. OpenAI’s Tokenizer tool lets you determine the token count of a text.

The recommendation to employ “mega prompts” suggests using a significant amount of text with detailed information in the prompt. While this essentially aligns with the CIDI model, it lacks specific content.

In our view, the term “mega prompt” is memorable. It conveys the idea that many Chat-GPT tasks benefit from substantial preliminary information.

However, there are technical boundaries to the length of the texts Chat-GPT can process. The free version of Chat-GPT has a limit of 4097 tokens. Chat-GPT Plus can manage 16,000 tokens, while even greater input is feasible through the API. OpenAI defines one token as a partial word, akin to a morpheme, the smallest meaningful unit of a word. The exact counting method isn’t made explicit; for token counts, OpenAI’s Tokenizer tool can provide assistance.

Important: The 4097-token limit in the free Chat-GPT version encompasses both the question and the answer. If the question is 4000 tokens long, you’re left with only 97 tokens for the response.
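Since prompt and answer share the token budget, it helps to estimate your prompt’s size before sending it. OpenAI’s rule of thumb for English text is roughly four characters per token; the sketch below uses that heuristic (an approximation only, not the exact tokenizer):

```python
def estimate_tokens(text):
    """Very rough token estimate: about four characters per token in English."""
    return max(1, len(text) // 4)

def remaining_budget(prompt, limit=4097):
    """Tokens left for the answer once the prompt's share of the limit is used."""
    return limit - estimate_tokens(prompt)

prompt = "You are a chef, and you need to create a weekly meal plan."
print(estimate_tokens(prompt), remaining_budget(prompt))
```

For an exact count, use OpenAI’s Tokenizer tool instead of this heuristic.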

Tip 6: Reuse Contexts

On the left side of Chat-GPT, you’ll find all your recent questions (chats) with the AI. If they’re related thematically, consolidate them within a single chat to supply more context for each question.


If you ask Chat-GPT to create a weekly meal plan and inform it about individual family members’ preferences and restrictions, the AI retains this data within a chat. So, when making new meal plan requests, you don’t need to repeat the preferences and allergies.

However, there’s a 4097-token limit per chat in Chat-GPT. If the chat exceeds this, the oldest information might be disregarded or the chat truncated.
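In API terms, “reusing context” means the whole conversation history travels with each new request, which is why the token limit applies to the chat as a whole. A minimal sketch of that mechanism (our own; the trimming strategy and the preference messages are simplified illustrations):

```python
def add_turn(history, role, content):
    """Append one message (user or assistant) to the conversation history."""
    history.append({"role": role, "content": content})

def trim_history(history, max_tokens=4097):
    """Drop the oldest messages once a rough token estimate exceeds the limit."""
    def total(msgs):
        return sum(len(m["content"]) // 4 for m in msgs)
    while len(history) > 1 and total(history) > max_tokens:
        history.pop(0)  # the oldest information is discarded first
    return history

chat = []
add_turn(chat, "user", "My family is vegetarian and allergic to nuts.")
add_turn(chat, "assistant", "Noted, I will avoid meat and nuts in every plan.")
add_turn(chat, "user", "Create next week's meal plan.")  # no need to repeat preferences
print(len(trim_history(chat)))  # all three messages still fit the budget
```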

Tip 7: Prompts from Prompt Engineers

For the best prompts for AI, it’s worth exploring the discussion forum of Chat-GPT prompt engineers: they have a dedicated channel on Discord.

The channel is home to many “prompt engineers” who share valuable prompt tips. After accepting the group’s invitation, click on “prompt-engineering” or “prompt-library” on the left side to browse the English-language tips. Note, however, that these tips often target users of the Chat-GPT API.
