From the course: LLMOps in Practice: A Deep Dive
Prompting - Python Tutorial
Previously, you learned about large language models and began to understand how strong they are at completing text. Naturally, the next thing to do is to give them text to complete. However, modern large language models can do much more than basic autocompletion, and getting them to behave the way that you want becomes both an art and a science. We'll explore that in the next few videos, and the goal, whether you're an absolute beginner or an experienced pro, is to help you get the most out of your interactions with AI. I'm going to cover the basics of prompting and how it works. So let's dive in.

First things first. What exactly is prompting? In the context of interacting with LLMs, prompting is the art of providing an input or a set of instructions to the model to generate a response. Think of it as asking a question or giving a command to a very knowledgeable assistant. For example, if you prompt with, tell me a joke, the model processes your request and generates a joke in response. It's really that simple. But as we'll see, prompting can be much more powerful and nuanced.

So why is prompting so important? Well, the quality of the response depends heavily on the quality of your prompt. A clear, well-structured prompt can yield a precise and useful answer, while a vague or poorly worded prompt might not get you the information that you need.

So let's break down the components of a good prompt. You should be clear: use straightforward language to avoid confusion. You should be specific: provide as much detail as is necessary. You should be contextual: give the model some context to understand your request better. And you should be goal-oriented: be clear about what you want to achieve with the response.

So here's an example. Consider this vague prompt: help me write an essay. The model might not know where to start or what topic you're interested in writing about, and you'll get a generic response like this.
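Since this is a Python course, it can help to see those four components as code. This is only a sketch: the build_prompt helper below is my own illustration, not part of any library, and a real application would send the resulting string to a model.

```python
def build_prompt(task: str, detail: str, context: str, goal: str) -> str:
    """Assemble a prompt from the four components of a good prompt:
    a clear task, specific detail, context, and a stated goal."""
    return (
        f"{task} {detail} "     # clear and specific
        f"Context: {context} "  # contextual
        f"Goal: {goal}"         # goal-oriented
    )

# Upgrading the vague "help me write an essay" request:
refined = build_prompt(
    task="Help me write a history essay",
    detail="about the important events in the Industrial Revolution.",
    context="This is for a general audience with no history background.",
    goal="A short outline with a thesis statement and five key events.",
)
print(refined)
```

The exact wording matters less than making sure each of the four components actually appears in the prompt.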
But how about a more specific prompt? Help me write a history essay about the important events in the Industrial Revolution. This prompt gives the model clear instructions and context, making it more likely to provide a useful response. And you can see that when I tried this, I got a very detailed response, much more than I can even fit on this slide.

So now let's take a look at some basic prompting techniques. You can use simple queries, instructions, or questions to interact with a model. A simple query is a direct request for information or action. An instruction is a command for the model to perform a specific task. A question is an inquiry that requires a detailed response. Pause the video for a moment and try a simple query, an instruction, and a question in your favorite chatbot.

So for a question, and this is a true story: one day I was out running with a friend in Tokyo, and an animal came out of the underbrush near a park. We had no idea what it was, and to me it actually looked like a red panda, which looks like this. But it couldn't be a red panda unless it had escaped from a zoo. So I asked my favorite chatbot a question like, what is the animal that looks like a red panda but can be found in cities like Tokyo? And I got the answer: a creature called a tanuki. And here's a picture of one. Now, maybe that's what I saw, but I could swear the one we encountered was a little more red in color. ChatGPT's description said that Japanese folklore gives these creatures shapeshifting abilities, so maybe. Anyway, what did you try? Please share your experiences with us.

Regardless of whether you're inquiring, instructing, or questioning, here are some good tips. Be explicit. Clearly state what you want the model to do. Instead of saying something like, tell me something interesting, you could say, tell me an interesting fact about space exploration.
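When you move from a chatbot window to Python code, a query, an instruction, and a question all take the same shape: a user message sent to the model. Here's a sketch of the three styles in the message-list format that chat-style APIs generally expect; the actual client call is left as a comment because it depends on your provider and credentials.

```python
# Three interaction styles, one mechanism: a prompt in a user message.
query = "What is the capital of Japan?"                        # simple query
instruction = "List three facts about the Industrial Revolution."  # instruction
question = ("What is the animal that looks like a red panda "
            "but can be found in cities like Tokyo?")          # question

def to_messages(prompt: str) -> list:
    """Wrap a single prompt in the message format chat APIs expect."""
    return [{"role": "user", "content": prompt}]

for prompt in (query, instruction, question):
    messages = to_messages(prompt)
    # With a configured provider client you would send it, e.g.:
    # response = client.chat.completions.create(model="gpt-4o-mini",
    #                                           messages=messages)
    print(messages)
```

Whichever style you use, the model just sees text to complete, which is why the wording tips that follow matter so much.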
In my case, I was as explicit as I could be, describing the animal as reminding me of a red panda. Also, you should use complete sentences when you can; this really does help reduce ambiguity. Don't be afraid to iterate and refine. If the model's response isn't quite what you wanted, refine your prompt and try again. Iteration is the key to improving the quality of responses. You should also set constraints. Sometimes adding constraints can help you get more accurate answers. For example, list three benefits of exercise for mental health is much more focused than just benefits of exercise. Look at the case that I shared: an important constraint was that the mysterious creature I saw was in Tokyo. And of course, don't be afraid to ask follow-up questions. If you need more information, just use follow-up prompts. For instance, after getting a summary of an article, you might ask, can you explain the main argument in a bit more detail?

Let's walk through an example of iterating and refining a prompt. Suppose I ask, explain climate change. The response is very broad, and it might not contain the details that I need. So I refine my prompt to, explain the main causes of climate change in simple terms. This refined prompt guides the model to provide a more targeted and understandable answer.

Remember, the model is artificially intelligent. It simulates understanding by processing your words and reflecting information back to you. You cannot assume that it understands what you're asking, and although it's remarkably good at picking up the intent of your prompts, it's still artificial. It will make mistakes. Being super clear can help avoid this, and in that way, it's a bit like coding. And just like coding, it's important to be aware of common mistakes in prompting so you can avoid them. Don't be vague: avoid prompts that are too broad or unclear. Don't overload by including too many requests in a single prompt.
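For example, an overloaded prompt has a simple fix in code: send a sequence of focused prompts rather than one giant request. The prompts below are my own illustration; in a real session each one would be its own turn in the conversation, so you can iterate and refine as you go.

```python
# One overloaded request crammed into a single prompt:
overloaded = ("Tell me about climate change, list its main causes, "
              "and write a summary for a middle-school class.")

# The same request, broken into focused, single-purpose prompts:
prompts = [
    "Explain climate change in simple terms.",
    "List the main causes of climate change.",
    "Summarize those causes for a middle-school class.",
]

for p in prompts:
    # Each prompt would be sent as its own turn, letting you refine
    # based on the previous answer before moving on to the next step.
    print(p)
```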
Break it down into multiple prompts if necessary. Challenge your assumptions, and don't assume the model has the same context or knowledge that you do; provide the necessary details in your prompt. Don't neglect context: provide as much detail as you can. For example, instead of a vague prompt like, tell me about science, a more effective prompt would be, tell me about the scientific method and its importance in research. This gives the AI a clear topic and the context it needs to generate a useful response. Pause the video and try these out for yourself to see the difference in output.

To recap, prompting is a powerful tool for interacting with artificially intelligent models like Gemini or GPT. By crafting clear, specific, contextual, and goal-oriented prompts, you can significantly improve the quality of your responses. In our next video, we're going to explore how to use system prompts to specify a role for the model.