How Futurewhiz uses AI in education: behind the scenes
Nataliia
AI in education is everywhere right now.
There’s excitement. There’s skepticism. And there are a lot of assumptions about what it actually means in practice. But at Futurewhiz, we’re interested in what it can really do for children and what it can’t.
We sat down with four members of our education content team (Lotte, Verine, Renske, and Judith) to find out how AI has changed their work and to understand what’s really happening behind the scenes.
The reality is a lot more human than you might expect.
It starts with the prompt and the person behind it
Ask anyone on our team what the biggest misconception about their job is, and the answer is surprisingly consistent: people think AI does the work. Well… it doesn’t.
“There’s a big part of human work behind it,” says Lotte, who creates history content for secondary education. “AI makes us faster, but quality is everything. I’m constantly sculpting the prompt; it’s much more like shaping something than generating it.”
That idea, sculpting rather than generating, captures how the team works with AI.
Prompts are carefully designed frameworks that include:
- The student’s level
- Learning objectives
- Required prior knowledge
- Cognitive load and scaffolding
- Question types and structure
- Alignment with specific school curricula
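To make this concrete, here is a minimal illustrative sketch of how a structured prompt might combine those elements. The field names, wording, and example values are our own invention for illustration; they are not Futurewhiz’s actual tooling or templates.

```python
# Hypothetical sketch: assembling a structured prompt from pedagogical inputs.
# All field names and example values are illustrative, not real Futurewhiz tooling.

def build_prompt(level, objective, prior_knowledge, scaffolding,
                 question_type, curriculum):
    """Combine the pedagogical framework into a single prompt for a language model."""
    return (
        f"You are writing practice material for {level} students.\n"
        f"Learning objective: {objective}\n"
        f"Assume prior knowledge of: {prior_knowledge}\n"
        f"Scaffolding: {scaffolding}\n"
        f"Question type: {question_type}\n"
        f"Align with: {curriculum}\n"
        "Keep cognitive load low: one concept per question."
    )

prompt = build_prompt(
    level="first-year secondary history",
    objective="explain the causes of the Dutch Revolt",
    prior_knowledge="the role of Philip II",
    scaffolding="start with a recall question, then an explanation question",
    question_type="multiple choice with four options",
    curriculum="the Dutch secondary history syllabus",
)
print(prompt)
```

The point of a template like this is that the expertise lives in the inputs: a weak objective or missing prior-knowledge constraint produces weak output, no matter how capable the model is.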
Verine, who works across content generation for StudyGo and Squla, agrees. “The quality of what AI produces depends entirely on what you put in. If your prompt is rooted in expertise, with clear boundaries and good examples, you get something valuable back. If not, you don’t.”
AI, in that sense, doesn’t replace expertise; it depends on it.
The real work happens before and after AI
AI speeds up content creation, but it doesn’t remove the need for careful thinking or careful checking.
“It definitely boosts our capacity,” says Verine. “But everything still needs to be checked.”
And it is: every piece of content that ends up on the platform is reviewed by at least two people before it reaches a student.
Renske explains: “We analyse the output, identify what’s off, and refine from there. It’s a constant process of iteration.”
That process is especially important in subjects like math and physics, where accuracy is critical.
“With more complex or number-based content, you can’t rely on it blindly,” Verine adds. “It can hallucinate, so we still do a lot manually.”
What doesn’t work and why that matters
The team is open about failure. In fact, a large part of the process is trying things that don’t work.
“A lot of what we generate isn’t usable,” says Judith.
Some outputs lack creativity. Others lack accuracy. And sometimes, things just go completely off track.
“I once tried to generate economics questions and got restaurant recommendations for Amsterdam,” Renske recalls.
Even AI audio experiments have produced unexpected results, from strange filler sounds to completely unusable recordings.
These moments aren’t just funny: they’re a reminder of why AI still needs human judgement, and why every piece of content is carefully reviewed.
AI is changing how we think about learning
One of the most promising AI use cases in education is personalised learning.
“We think much more about scaffolding, cognitive load, and how to check real understanding,” says Renske.
AI makes it easier to experiment and refine how content supports learning.
One example is the AI tutor, which guides students through questions instead of giving direct answers.
“Students have to explain their reasoning,” says Verine. “It creates a more active learning process.”
This shift moves learning away from memorisation and toward deeper understanding.
What AI cannot replace in education
For all its potential, the team is clear about one thing: AI is a tool, not a replacement.
“The teacher-student relationship and social interaction are irreplaceable,” says Lotte.
Judith agrees: “Students need a personal connection. That sense of belonging that AI can’t create.”
There are also broader considerations around ethics, bias, and sustainability.
AI can sometimes agree too easily with users instead of challenging them, but real learning often requires critical thinking and honest feedback. That is why responsible AI use in education remains essential.
A realistic view of AI in EdTech
Many people assume EdTech teams are fully embracing AI without hesitation.
In reality, the perspective is more balanced.
“I was really sceptical at the start,” Lotte admits. “I still am, in some ways. But AI can know things that no single person can know. Used carefully, it’s a genuine boost, a brainstorming partner, a way to move faster without cutting corners.”
Judith raises the sustainability question directly: “My 2020 self would have serious concerns about how much energy AI consumes. That’s something I think about, and something our industry needs to take seriously.”
Today, AI is seen as:
- A brainstorming partner
- A way to scale content
- A tool to support personalised learning
But always with clear boundaries.
What this means for students
Over 1 million students use Futurewhiz platforms every year. Every question they answer, every piece of content they interact with, has been shaped and scrutinised by people like Lotte, Verine, Renske, and Judith.
AI has made them faster. It has deepened their understanding of how children learn. It has occasionally recommended restaurants when asked about economics.
But the judgment, the care, and the responsibility for what ends up in front of a child? That remains entirely human at Futurewhiz.
Because learning means growth, and growth takes people who care.