On Being Certain
Ligia Fascioni
4/21/2025 · 3 min read
Boring, but interesting
The title “On Being Certain: Believing You Are Right Even When You’re Not” grabbed me right away. But I have to admit, I found neurologist Robert A. Burton’s book pretty boring.
You know those books that could’ve just been a great article, but have to stretch to 250 pages, so they end up packed with endless stories? Yeah, that. It does test your patience a bit, but it doesn’t take away from the value of the content.
So, let’s get to it!
The sense of knowing
The author starts by explaining that the feeling we “know” something—just like the feelings of familiarity or strangeness—doesn’t fit into the usual mental states neurologists study, like emotions, moods, or thoughts.
The sense of knowing and its cousins (familiarity, weirdness) are actually part of a different kind of mental activity.
They’re part of an internal monitoring system that lets us access our thoughts and judgments. Wild, right?
He uses an analogy: our bodies have all sorts of sensors—like sight and hearing—to perceive the world around us. In the same way, we have a set of sensors to access our inner world.
Here’s how it works: when your body needs food, the hunger sensor kicks in. When you’re dehydrated, the thirst sensor takes over. So, we’ve got sensors for the outside world, but also a whole set for internal use.
In the same vein, Burton says we have a sensory system that tells our minds what we’re thinking.
For example, to encourage learning, the brain needs to feel like it’s on the right track—that what it’s learning is correct. We also develop reward and encouragement mechanisms for thoughts that haven’t been tested yet, but might be useful down the road.
For the reward to really motivate our brains to think and learn, some sensations—like what he calls the “feeling of knowing” and the “feeling of conviction”—need to seem like conscious, deliberate conclusions, even though they’re not. As a result, the brain creates a constellation of mental sensations that look like rational thoughts, but actually aren’t.
These involuntary, uncontrollable feelings are really just mental sensations—ones that are just as prone to perceptual illusions as any other sensory system (think about how easy it is to fool your eyes or ears; why would it be any different with the other sensors?).
The certainty committee
Burton uses a fun metaphor: picture every neural network, nested inside other neural networks, as a member of a giant committee.
When a question comes up, every member gets a vote, and they’re all tallied up to reach a conclusion. Now imagine each committee member represents a mental sensation—from the feeling of knowing to familiarity, weirdness, strangeness, or reality. They all weigh in on the final decision about a thought, including whether it’s right or wrong.
Notice there's no “feeling of understanding” vote. That's because the member in charge of familiarity (“yep, I know this, I've seen it before”) carries a ton of weight. Together with its buddy, the “sense of conviction,” it bulldozes the rest (unless something is really, really bizarre).
The result: you get the feeling you’re right, even without enough evidence, because the analysis wasn’t rational—it was emotional.
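Burton doesn't put numbers on his metaphor, but a toy sketch makes the weighting mechanics concrete. Everything below (the sensation names, the weights, the votes) is an illustrative assumption of mine, not something from the book:

```python
# Toy model of Burton's "certainty committee" metaphor.
# Sensation names, weights, and votes are illustrative assumptions,
# not values from the book.

committee = {
    # sensation: (weight, vote); vote is +1 for "feels right", -1 against
    "familiarity": (5.0, +1),   # "I've seen this before" carries a ton of weight
    "conviction":  (4.0, +1),   # its heavyweight ally
    "strangeness": (1.0, -1),   # easily outvoted...
    "weirdness":   (1.0, -1),   # ...unless something is really, really bizarre
}

def feels_right(committee):
    """Tally the weighted votes and return the committee's verdict."""
    tally = sum(weight * vote for weight, vote in committee.values())
    return tally > 0

print(feels_right(committee))  # True: the thought "feels right"
```

The telling part is what the tally leaves out: no member ever consults the evidence, so the verdict can feel like a rational conclusion without being one.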
What to do?
According to the neuroscientist, this feeling of certainty isn’t something conscious you can control or turn down at will. What you can do is consciously introduce new, opposing information; then the values and weights can be reevaluated. That’s why hearing the other side is so important.
The message is that feelings of knowing, being right, conviction, and certainty aren’t rational conclusions or conscious choices. They’re mental sensations, based more on familiarity and the comfort of what we know than on logical reasoning.
For Burton, the standard definition of KNOWING—directly perceiving, grasping information in the mind with clarity and certainty, considering the truth beyond doubt—doesn't hold up against what we know about how the brain works.
He suggests we should swap out the word “know” for “believe” everywhere, even in science.
Instead of saying you’re sure a cause led to an effect, it’d be better to say, “I believe this cause led to that effect, given the evidence.”
That wouldn’t deny scientific knowledge at all, just recognize its limits. Saying “I believe” would constantly remind us of the limits of knowledge and objectivity. But let’s be honest, that idea would never catch on, right?
As the author says, certainty is biologically impossible; we need to learn (and teach our kids) to tolerate the discomfort of uncertainty. Science already has the language and the tool for this—it’s called probability.
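To make that concrete (the framing here is mine, not Burton's), Bayes' rule is how probability phrases “I believe this, given the evidence”:

P(cause | evidence) = P(evidence | cause) × P(cause) / P(evidence)

The left-hand side is explicitly a degree of belief rather than a flat “I know,” and it gets recomputed whenever new or opposing information shows up, which is exactly the reevaluation Burton recommends.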
To wrap up, he drops a great quote from Nobel Prize-winning physicist David Gross:
“The most important product of knowledge is ignorance.”
So true. Socrates was saying this ages ago—he knew that he knew nothing…

