She Was the Teacher, Not the Pupil
How a (kind of gross) mistake led me to build a whole new product-changing feature
I’m building a personalised storybook for grandparents — a product where someone shares memories about a grandparent’s life and gets back a custom illustrated book. One of my first testers told me she loved her book. The illustrations were beautiful. The story captured the feeling of her life. But there was a problem.
She’d told the story of meeting her partner at school — they were both teachers. When the AI illustrated that scene, it drew her as a pupil. A child sitting at a desk, not an adult standing in front of a classroom.
She had said she was a teacher. It was right there in the text. But somewhere between her words and the final image, the AI made an assumption. School plus meeting someone equals young. It pattern-matched instead of understanding.
And that’s the thing about building with AI — it doesn’t have context the way people do. When someone says “we met at school,” a friend would ask “oh, were you studying there?” An AI just picks the most statistically likely interpretation and moves on. It doesn’t know to pause. It doesn’t know what it doesn’t know.
I’ve spent months trying to make the personalised grandparent book pipeline produce perfect results every time. Better prompts, more detailed instructions, validation steps, retry logic. And I’ve made it much better. But perfect? No. Because people’s lives are beautifully, irreducibly messy. A grandmother who was a rock climber. A grandfather who wore Hawaiian shirts every day of his life. A couple who met at a protest in 1968. These aren’t edge cases — they’re the whole point of a family memory book. The details that make a life story worth telling are exactly the details that AI is most likely to flatten into something generic.
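For readers curious what "validation steps, retry logic" can look like in practice, here is a minimal sketch of the idea: generate a draft, check it against facts the user explicitly stated, and retry with a corrective hint if it conflicts. Every function name here is hypothetical, a stand-in for a real model call rather than Memolio's actual pipeline.

```python
def generate(prompt: str, hint: str = "") -> str:
    # Stand-in for a real image/text model call. A real implementation
    # would call a generation API; this stub just mimics the failure
    # mode described above (defaulting "school" to "pupil").
    if "adult" in hint:
        return "an adult teacher at the front of a classroom"
    return "a pupil sitting at a desk"

def violates_facts(draft: str, facts: list[str]) -> bool:
    # Naive consistency check: a draft depicting a "pupil" conflicts
    # with the stated fact that the subject was a teacher.
    return "teacher" in facts and "pupil" in draft

def generate_with_validation(prompt: str, facts: list[str],
                             max_retries: int = 2) -> str:
    # Generate, validate against user-stated facts, and retry with a
    # corrective hint when the draft contradicts them.
    draft = generate(prompt)
    for _ in range(max_retries):
        if not violates_facts(draft, facts):
            break
        draft = generate(prompt, hint="depict the subject as an adult teacher")
    return draft

print(generate_with_validation("They met at school", ["teacher"]))
```

Even a loop like this only catches contradictions you thought to check for, which is exactly why it improves results without ever making them perfect.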
This realisation scared me for a while. I’d been avoiding building an edit function because it felt like admitting defeat — admitting that the AI couldn’t do it alone. But that tester’s frustration reframed it completely. The edit function isn’t a concession. It’s the product working as it should. A person shares their story, the AI creates a first draft of something beautiful, and then the person refines it — correcting the things only they could know.
What made this click was thinking about who actually orders a personalised storybook for grandparents and grandchildren. Many of them are in their sixties and seventies. They didn’t grow up with AI. They’re not going to write a better prompt or learn to engineer their way to a perfect output. But they absolutely know that they weren’t a schoolgirl when they met their husband. They know what they were wearing. They know the colour of the kitchen where Sunday dinners happened. The edit function gives them a way to bring that knowledge into the book without needing any technical skill.
The broader lesson, I think, applies to anyone building with AI right now: stop trying to make the machine perfect. Instead, make it easy for people to fix what the machine gets wrong. That’s where the magic actually is — not in a flawless first output, but in a collaboration between human memory and artificial creativity.
That’s exactly what I’m building at Memolio — and that one piece of user feedback changed everything about how I think about the product.
