The Six Honest Serving-Men and AI Literacy
If you want to use AI well, start with better questions.
Rudyard Kipling published The Elephant's Child in 1902 as part of his Just So Stories. Tucked into the tale is a short poem that has quietly outlasted the century:
I keep six honest serving-men
(They taught me all I knew);
Their names are What and Why and When
And How and Where and Who.
Six words. Endless applications. And as it turns out, they map almost perfectly onto the challenge of using artificial intelligence well.
AI Responds to Questions, Not Wishes
There is a common misconception about AI tools: that they somehow "know" what you need. They do not. They respond to what you ask — and the gap between what you mean and what you type is often where results fall apart.
Most weak AI outputs trace back to weak prompts. Consider the difference between these two requests:
Poor prompt: "Create a lesson about recycling."
Stronger prompt, built on the six serving-men:
- What: A reading comprehension worksheet about recycling
- Who: Adult A2-level English learners
- Where: An online class based in Argentina
- When: A 60-minute session
- How: Include a 120-word reading text, 10 vocabulary items, and 5 comprehension questions
- Why: To practice the simple present tense and introduce environmental vocabulary
The second version does not just describe a topic — it constructs a context. The AI now has enough information to generate something genuinely useful, not just plausible. Structure improves output quality immediately, and it dramatically reduces editing time afterward.
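For readers who assemble prompts programmatically, the six-question structure can be sketched as a simple template. This is an illustrative sketch, not an official API: the function name and field labels are my own, and any real tool would substitute its own wording.

```python
# A minimal sketch of the six-question prompt structure.
# Field labels (Task, Audience, etc.) are illustrative choices, not a standard.

def build_prompt(what, who, where, when, how, why):
    """Assemble a structured prompt from Kipling's six questions."""
    return (
        f"Task: {what}\n"
        f"Audience: {who}\n"
        f"Setting: {where}\n"
        f"Timing: {when}\n"
        f"Format: {how}\n"
        f"Goal: {why}"
    )

prompt = build_prompt(
    what="A reading comprehension worksheet about recycling",
    who="Adult A2-level English learners",
    where="An online class based in Argentina",
    when="A 60-minute session",
    how="A 120-word reading text, 10 vocabulary items, and 5 comprehension questions",
    why="Practice the simple present tense and introduce environmental vocabulary",
)
print(prompt)
```

The point is not the code itself but the constraint it enforces: the prompt cannot be built until all six questions have an answer.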
The Risk of "Ten Million Serving-Men"
Kipling's poem has a second, less-quoted stanza. In it, the speaker describes a small person, far less disciplined, who keeps ten million serving-men running constantly, servants "who get no rest at all."
That image feels uncomfortably familiar in the age of AI.
With generative tools, it is easy to:
- Produce twenty versions of the same idea without deciding which direction you actually want
- Keep refining a prompt indefinitely, chasing a slightly better output
- Ask "why" questions that lead to more "why" questions that lead nowhere in particular
- Overproduce content until the original goal disappears into the noise
More output is not better thinking. Without clear boundaries, AI becomes a generator of distraction rather than a tool for clarity. The ability to produce is not the same as the ability to decide.
The "Rest from Nine Till Five" Principle
The poem's speaker gives the serving-men rest: "I let them rest from nine till five, / For I am busy then." That line is easy to skip over, but it carries real weight.
Knowing when to stop asking is as important as knowing how to ask in the first place. Applied to AI use, this means:
- Define the task before you open the tool. Know what you need before you start generating.
- Ask focused, scoped questions. One clear prompt beats five vague ones.
- Stop when the objective is met. Resist the pull toward endless iteration.
- Reflect before generating more. Ask whether the next output will actually improve the work, or just add to the pile.
Discipline is not a constraint on creativity. It is what keeps creativity pointed at something useful.
AI Literacy Is Question Literacy
Many discussions of AI literacy focus on tools — which platforms to use, how interfaces work, what features are available. These things matter, but they are secondary.
Real AI literacy is the ability to ask well. Before the AI ever processes a prompt, the user must already have answered:
- What exactly do I need?
- Why am I doing this?
- Who is this for?
- How should the output be structured?
- Where and when will it be used?
These are not AI questions. They are thinking questions. The AI just makes the quality of that thinking visible, quickly and at scale.
For educators, this reframing is important. Teaching students to use AI well is not a technical skill — it is a thinking skill. And thinking skills transfer across tools, platforms, and years.
When students learn to ask better questions, they write better prompts. When they write better prompts, they engage more critically with the output. When they engage critically with the output, they stop being passive consumers of generated content and start being active thinkers who happen to have a powerful tool available.
A Classroom Application
Here is a simple activity that makes this visible in practice:
- Give students a deliberately vague AI prompt — something like "Write something about climate change."
- Ask them to improve it using the six questions: What, Who, Where, When, How, and Why.
- Run both prompts and compare the outputs side by side.
- Discuss what changed, what improved, and why structure made the difference.
The conversation that follows is usually rich. Students notice that the AI did not get "smarter" — the question got clearer. That distinction is the insight worth holding onto.
Final Thought
Kipling wrote his six serving-men as tools for curiosity — a child's way of understanding the world. Over a century later, they are just as useful as a framework for human-AI collaboration.
Teaching students to master What, Where, When, How, Why, and Who is not really about AI at all. It is about structured thinking — the kind that makes any tool more powerful, any communication more precise, and any work more intentional.
And structured thinking, as Kipling might have put it, always outlasts the technology.
This post is part of an ongoing series on AI literacy in language education.
About the Author
Dr. Doris Molero is the founder of E-Language Center. She integrates AI, storytelling, and structured thinking into online English classes to help learners build confidence and communicate effectively.