Act Promptly: AI Best Practice in Schools
- Mark Fraser
- Mar 20
- 4 min read
How teachers can help their students to use AI prompts more effectively

You’ve got to feel for careers teachers. In the old days, the worst part was letting someone down gently and suggesting they might not get to play for Man United. ‘Have they considered being a car mechanic?’ At least you had a pretty good idea what their fanciful career choices meant.
‘It’s always been my dream to be a metaverse architect!’ ‘Virtual Influencer Coordinator’? Anyone? No, nor me - had to look it up and still not sure which part of the job is virtual.
AI Prompt Engineer is another that definitely falls into the category of jobs that didn’t exist five years ago - and might not in another three. In case you don’t know, these professionals specialize in crafting effective and specific prompts to elicit desired outputs from AI models.
It sounds complicated and important, doesn’t it? And, on some levels, it is. If you’re working on training the models, I get that it helps to be precise and efficient. But I also think it contributes to the mystification, deliberate or otherwise, of AI - and generative AI in particular.
As far as teachers are concerned, when we’re thinking about how to teach our students how to use Gen AI effectively (and we should DEFINITELY be doing that, by the way) it pays to be a bit less ‘precious’ about it. Be more playful!
Instead of worrying about crafting the perfect prompt, it’s better to use AI interactively. Just ask for something. Then ask it to modify its output based on what it gives you.
That’s not to say that the prompt isn’t important. As any of those motivational posters in a gym will tell you, what you get out does depend on what you put in. But you don’t need to be perfect straight away. Actually, there’s probably a poster for that, too.
I tried this with a Year 11 student this week. She was struggling with ‘Romeo and Juliet’ and had been asked to write about Act 1 Scene 2. I suggested she ask Gemini. I wasn’t so interested in the response - I know the play well enough - but I was interested in the way she approached the process.
Her first question was pretty typical of the prompts students use. She asked, somewhat plaintively, I thought: ‘Explain this to me.’ And then she attached a PDF of the extract.
I have to say the output was pretty good. She was delighted and, on her own, she might have left it there. Clearly, it helped her and, as she re-read the text with the AI assistance beside her, it was evident from her expression that she was starting to understand it.
Apart from this weird bit of (I believe) Bangla that crept in somehow!

Apparently, it translates to ‘conditions’. At least it gave me an opportunity to mention that AI does get things wrong - everything’s a teachable moment, right?
We didn’t leave it there, though. Instead, we put the same question into Claude and ChatGPT and looked for the differences.
This simple comparison is good practice for two reasons.
First, it reinforces the fact that AI doesn’t provide the answer, but only an answer. An answer that has to be considered, scrutinised and challenged, not merely accepted.
Second, it’s a very low-stakes way of introducing some comparative and evaluative skills. Instantly, we’re scaling the heights of Bloom’s Taxonomy. Happy days!
In this case, my student noticed that Gemini and Claude both emphasise that Lord Capulet’s other children have all died, something that ChatGPT misses completely. The point here is not that GPT is less good - rather, that the comparison opened up a discussion about why this fact might become important later in the play.
We then went on to refine the prompt by providing some context. I find this is a very effective way of generating a more useful response. This time she asked:
I am a detail-oriented Year 11 UK student, skilled in thoroughly completing assignments and seeking to understand assigned content to the best of my ability. I am reading Romeo and Juliet by Shakespeare for GCSE English Literature. Explain this to me.
Instantly, all three bots gave much more detailed assistance, tending to focus on particular lines and explaining their meaning in quite a literal sense. The accuracy was pretty impressive. I’ll share links to the complete chats here so you can see the difference for yourself.
Then we tried another prompt, this time asking for a slightly more specific response.
I am a detail-oriented Year 11 UK student, skilled in thoroughly completing assignments and seeking to understand assigned content to the best of my ability. I am reading Romeo and Juliet by Shakespeare for GCSE English Literature. Your task is to help me understand what this passage from Act 1 Scene 2 means in the context of the play.
This shifted the focus of all three responses quite significantly and we started to get information about character development and the overall narrative structure.
Finally, we built on this start and asked a different but related question:
How can we relate these ideas to the context of Shakespeare’s other tragedies?
Again, the responses were impressive and inspired a much wider discussion about tragedy and genre than we might have had otherwise.
So, three tips that I think are useful when helping students to prompt AI effectively:
- Provide some context
- Be specific
- Build on the conversation
But the most important takeaway, I think, is that you don’t have to get it right straight away. We have to see the use of AI as a process, not a solution, and we have to frame it in those terms for our students.
If we define it as a useful tool to help them to develop their understanding, rather than an illicit cheating machine, we’ll be on the road to developing proper AI literacy - and that’s what we should be aiming for.