In the technology world, there’s a common saying that goes “garbage in, garbage out”.
In essence it means that if you provide bad data you get poor results.
In AI, getting a great result heavily depends on asking the right questions. So “poor questions equal poor answers.” That’s what prompt engineering is all about.
Excelling at prompt engineering means crafting better prompts (your “inputs”) to get better “outputs” (results).
Why does it matter?
If you can master the art of prompt engineering, you could earn $300,000-plus a year, no engineering degree required.
What is prompt engineering?
According to CMS Wire, prompt engineering “…involves the strategic crafting of input text prompts to guide the AI model’s output toward a desired outcome. This process is essential because the quality and specific text used in the input prompt can significantly influence the AI’s generated content”.
What are other people saying?
The Economist says, “Writing effective prompts for AI is much simpler than, for example, mastering a programming language. But making the most of the algorithms within these black boxes has become harder. ‘Prompt engineering’ has been likened to guiding a dance partner or poking a beast to see how it will respond.”
Lance Elliot in Forbes had this to say about prompt engineering: “The use of generative AI can altogether succeed or fail based on the prompt that you enter. If you provide a prompt that is poorly composed, the odds are that the generative AI will wander all over the map and you won’t get anything demonstrative related to your inquiry.”
Sascha Heyer in Computer Weekly stated that “While OpenAI recently launched a prompt engineering guide to support users, this process doesn’t come without complexities. Prompt engineering is skilled work that requires precise wording, formatting and structuring to achieve the desired results. While it’s an art rather than an exact science, well-crafted prompts significantly enhance LLM performance. Different prompting techniques use different approaches and therefore, provide different answers”.
Here are the four types of prompts that he said you need to master (a short sketch of each follows the list):
- Input-Output Prompting – the model returns an answer directly in response to a single input.
- Chain of Thought (CoT) Prompting – the model is asked to show the reasoning steps it takes to reach an answer.
- Self-Consistency with CoT – several CoT responses are sampled and the most consistent answer is kept.
- Tree of Thoughts – the model explores multiple chains of thought and evaluates them to find the most effective answer.
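To make these concrete, here is a minimal sketch in plain Python of what the four styles can look like. The question, wording, and sampled answers are illustrative placeholders, not examples from Heyer’s article, and in practice the self-consistency samples would come from repeated model calls.

```python
# Rough sketch of the four prompt styles; everything here is illustrative.
from collections import Counter

question = "A train travels 120 km in 1.5 hours. What is its average speed?"

# 1. Input-Output Prompting: ask for the answer directly.
io_prompt = f"{question}\nAnswer with a single number in km/h."

# 2. Chain of Thought (CoT): ask the model to show its reasoning first.
cot_prompt = f"{question}\nThink step by step, then state the final answer."

# 3. Self-Consistency with CoT: sample several CoT answers and keep the most
#    common final answer (simulated here with placeholder samples).
sampled_answers = ["80 km/h", "80 km/h", "75 km/h"]  # would come from repeated model calls
most_consistent = Counter(sampled_answers).most_common(1)[0][0]
print(most_consistent)  # -> 80 km/h

# 4. Tree of Thoughts: have the model explore and rate several reasoning
#    branches before committing to one (sketched as a single prompt).
tot_prompt = (
    f"{question}\n"
    "Propose three different ways to solve this, rate each approach, "
    "then follow the most promising one to the final answer."
)
```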
Going deeper
OpenAI has released a guide to best practices for prompt engineering, listing six strategies for getting better results. Here is a quick summary of what they suggest:
1. Write clear instructions
Don’t make ChatGPT guess what you want; write the clearest question you can and make it very specific (see the sketch after these examples).
- Worse example: “Who’s president?” Better example: “Who was the president of Mexico in 2021, and how frequently are elections held?”
- If you want it to adopt a persona, such as humorous, serious, or skeptical, make sure that is included in the prompt.
- If you want a 1,000-word article, make sure you request it in the prompt.
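As a rough sketch of what a clear, specific prompt might look like in code, here is the better example sent through the OpenAI Python SDK with a persona and a length requirement. The model name is an assumption, and an OPENAI_API_KEY must be set in your environment.

```python
# Minimal sketch using the OpenAI Python SDK (openai>=1.0); model name and
# wording are assumptions, and OPENAI_API_KEY must be set in the environment.
from openai import OpenAI

client = OpenAI()

vague_prompt = "Who's president?"  # shown only for contrast: the model has to guess which country and when

specific_prompt = (
    "Who was the president of Mexico in 2021, "
    "and how frequently are presidential elections held there?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model works here
    messages=[
        # The persona and the length requirement go straight into the prompt.
        {"role": "system", "content": "You are a skeptical political historian."},
        {"role": "user", "content": specific_prompt + " Answer in about 100 words."},
    ],
)
print(response.choices[0].message.content)
```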
2. Provide reference text
If you can provide the model with trusted information that is relevant to the current query, you can instruct it to use that information to compose its answer, as in the sketch below the example.
- Example: “Use the provided articles delimited by triple quotes to answer questions. If the answer cannot be found in the articles, write ‘I could not find an answer.’”
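Here is one way that instruction could be wired up in code. The articles, question, and model name are placeholders; only the overall pattern of delimiting reference text and falling back to “I could not find an answer” comes from the example above.

```python
# Sketch of the "provide reference text" tactic; article text and question are placeholders.
from openai import OpenAI

client = OpenAI()

articles = [
    "Prompt engineering is the practice of crafting inputs to guide AI output.",
    "Well-structured prompts can significantly improve LLM performance.",
]

# Wrap each trusted article in triple quotes so the model can tell reference
# text apart from the instructions.
reference_block = "\n\n".join(f'"""{article}"""' for article in articles)

prompt = (
    "Use the provided articles delimited by triple quotes to answer the question. "
    'If the answer cannot be found in the articles, write "I could not find an answer."\n\n'
    f"{reference_block}\n\n"
    "Question: What is prompt engineering?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```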
3. Split complex tasks into simpler sub-tasks
You may have one big task or question, such as writing a book in one day with ChatGPT on the topic of “How to be happy”.
- Ask ChatGPT to write an outline for your book first, then ask it to write an introduction (or overview) for each of those sub-topics as separate prompts (sketched below).
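Sketched below is one way to chain those sub-tasks: a first call asks for the outline, then each chapter gets its own prompt. The topic, chapter parsing, and model name are illustrative assumptions.

```python
# Sketch of splitting a complex task: outline first, then one prompt per chapter.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # assumption

def ask(prompt: str) -> str:
    """Send a single user prompt and return the model's reply."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Sub-task 1: the outline.
outline = ask(
    'Write a numbered outline of 5 chapter titles for a short book called "How to be happy".'
)

# Sub-task 2: a separate prompt for each chapter in the outline.
for line in outline.splitlines():
    chapter = line.strip()
    if not chapter:
        continue
    intro = ask(
        f'Write a 150-word introduction for the chapter "{chapter}" of "How to be happy".'
    )
    print(chapter, "\n", intro, "\n")
```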
4. Give the model time to “think”
If asked to multiply 17 by 28, you might not know the answer instantly, but you can still work it out given a moment (it’s 476).
Similarly, models make more reasoning errors when trying to answer right away, rather than taking time to work out an answer.
Asking for a “chain of thought” before an answer can help the model reason its way toward correct answers more reliably; a minimal sketch follows the tactics below.
- Tactics:
- Instruct the model to work out its own solution before rushing to a conclusion
- Use inner monologue or a sequence of queries to hide the model’s reasoning process
- Ask the model if it missed anything on previous passes
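As a minimal sketch of those tactics, the system message below tells the model to work out its own solution before judging a student’s answer and to check whether it missed anything. The wording and model name are assumptions; only the overall pattern comes from OpenAI’s guide.

```python
# Sketch of the "give the model time to think" tactics.
from openai import OpenAI

client = OpenAI()

system_instruction = (
    "First work out your own solution to the problem, step by step. "
    "Then compare it to the student's solution and only then say whether "
    "the student is correct. Finally, check whether you missed anything."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption
    messages=[
        {"role": "system", "content": system_instruction},
        {"role": "user", "content": "Problem: What is 17 x 28? Student's answer: 486."},
    ],
)
# The reply should reason first and then flag 486 as wrong (17 x 28 = 476).
print(response.choices[0].message.content)
```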
5. Use external tools
Compensate for the weaknesses of the model by feeding it the outputs of other tools.
For example, a text retrieval system can tell the model about relevant documents. A code execution engine like OpenAI’s Code Interpreter can help the model do math and run code. If a task can be done more reliably or efficiently by a tool rather than by a language model, offload it to get the best of both.
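As a rough sketch of offloading work to an external tool, the snippet below asks the model for a bare Python arithmetic expression and evaluates it locally instead of trusting the model’s mental math. The reply format and model name are assumptions, and a real system would run the expression in a proper sandbox rather than with eval().

```python
# Sketch of using an external tool for arithmetic instead of the model itself.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption
    messages=[{
        "role": "user",
        "content": (
            "What is 17 * 28 * 365? Reply with only a single Python "
            "arithmetic expression and nothing else."
        ),
    }],
)

expression = response.choices[0].message.content.strip()
# The "external tool" here is plain Python evaluation of that expression.
print(eval(expression))  # caution: only evaluate trusted, sandboxed output
```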
6. Test changes systematically
Improving performance is easier if you can measure it. In some cases, a modification to a prompt will achieve better performance on a few isolated examples but lead to worse overall performance on a more representative set of examples.
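A minimal sketch of that idea: run each prompt variant against a small set of questions with known answers and compare scores before and after a change. The evaluation set, prompt variants, and model name are all illustrative.

```python
# Sketch of a tiny evaluation loop for comparing prompt variants.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # assumption

eval_set = [
    {"question": "What is 17 * 28?", "answer": "476"},
    {"question": "What is the capital of Australia?", "answer": "Canberra"},
]

prompt_variants = {
    "terse": "{question}",
    "step_by_step": "Think step by step, then answer concisely: {question}",
}

def score(template: str) -> float:
    """Fraction of eval questions whose known answer appears in the reply."""
    hits = 0
    for item in eval_set:
        response = client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": template.format(question=item["question"])}],
        )
        if item["answer"].lower() in response.choices[0].message.content.lower():
            hits += 1
    return hits / len(eval_set)

for name, template in prompt_variants.items():
    print(name, score(template))
```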
Where can you find resources and training for prompt engineering?
Knowledge wants to be free, and there are many free courses for learning prompt engineering.
Here are a few to check out:
ChatGPT Prompt Engineering for Developers
OpenAI has provided a free course for developers in collaboration with DeepLearning.AI that is broken down into nine video chapters.
You’ll learn two key principles for writing effective prompts, how to systematically engineer good prompts, and also learn to build a custom chatbot.
All concepts are illustrated with numerous examples, which you can play with directly in the Jupyter notebook environment to get hands-on experience with prompt engineering.
These include:
- An Introduction to Prompt Engineering
- Guidelines
- Iterative
- Summarizing
- Inferring
- Transforming
- Expanding
- Chatbot
- The Conclusion
Prompt Engineering for ChatGPT (Coursera) via Vanderbilt University
Developed by Dr. Jules White, this Prompt Engineering course (which has over 214,000 students enrolled) will help you discover how to use AI as a tool to unlock human creativity.
Through this course, you can learn to create applications that can, say, help children learn math through a Pokemon-inspired game. Also, it shows the limitless possibilities of integrating AI into everyday life.
The course covers all the topics in 30 videos and also provides online learning material that you can purchase; the course itself is free, but those extra resources have a fee.
Prompt Engineering and Advanced ChatGPT by edX
This course already has over 1 million students enrolled. You will delve into the advanced techniques that can be used to prompt ChatGPT, allowing it to generate responses that are even more accurate, relevant, and engaging.
The course also explores the different applications of ChatGPT across various industries, including healthcare, finance, education, and customer service.
Last words
The introduction of ChatGPT in late 2022 significantly accelerated changes in various industries. This rapid evolution has caused concern among creators, writers, and businesses who fear being overrun by this new technology.
But it also offers many upsides; new careers, increased productivity, and entrepreneurial opportunities.
Learning new skills is required and “prompt engineering” is one of those skills. Ignoring change is a death wish for careers and businesses in this AI technology revolution.
The new reality is that you need to lean in or be run over. Not adapting is not an option.
The dinosaurs learned that lesson too late.