The Death of Expertise Was a Lie. What Actually Died Was Our Ability to Admit We're Wrong.
When I was seven years old, I had absolute certainty about something that was completely wrong.
We were driving home one evening in eastern Kentucky, probably from the Cedar Knoll Galleria, that newly built shopping center that felt like the height of sophistication. I noticed the moon was following us and pointed this out to my dad with the confidence only a second-grader can muster.
My dad tried to explain. Perspective, he said. Distance. The moon is so far away that it appears to stay in the same position relative to us even as we move.
I was unconvinced. In fact, I was emphatic. The moon was clearly following us.
So my dad pulled over on US-60. "See?" he said. "We stopped moving. The moon stopped moving. It's so far away it's always in the same spot. It just looks like it's moving when we're moving."
I looked at him with pity. "No," I explained carefully. "The moon stopped moving because we stopped moving. Because the moon is following us."
We tell this story at family gatherings now. Everyone laughs. But lately, I've been thinking about that moment differently. Not as a cute story about childhood magical thinking, but as something diagnostic about the moment we're living through.
Because when people talk about the "death of expertise," I think they're describing my seven-year-old self. Someone absolutely certain about something they don't understand, immune to explanation, interpreting evidence to confirm what they already believe.
Except here's what I've realized: The problem wasn't that I couldn't access expertise. My dad was right there, explaining it. The problem was that I couldn't admit I was wrong.
What the Science Actually Shows
A new study from researchers at the University of Adelaide and the University of Western Australia just reframed how we should think about misinformation. They gave 244 people a straightforward task: look at 60 news headlines (half true, half false) and figure out which ones were real.
But they also measured something called intellectual humility: essentially, your willingness to admit you might be wrong, to consider contradictory evidence, and to recognize the limits of your knowledge.
People who scored higher on intellectual humility were significantly better at identifying both true and false headlines. Not because they were more skeptical of everything, but because they were more accurately calibrated.
As the researchers put it:
"These findings suggest that all the measured aspects of intellectual humility (awareness of one's own limitation, openness to diverse and opposing viewpoints, critical evaluation of evidence, and willingness to update beliefs) may help with the detection of misinformation."
Notice what's not on that list: intelligence. Educational level. Access to information.
What predicted accuracy was character. Specifically, the character trait of being willing to be wrong.
We don't have a failure of education here. We don't have a failure of access to information. We don't even have a failure of intellect. We have a failure of character.
The Metacognitive Mirror
The study revealed something even more interesting about what's called metacognition, which is essentially thinking about your own thinking. It's your ability to step back and evaluate whether you actually understand something or just think you understand it.
Metacognition is noticing your thinking and steering it instead of letting it steer you.
People higher in intellectual humility were better at metacognition too. They had genuine insight into when their own judgments were reliable and when they were fooling themselves.
The researchers checked that this wasn't just a side effect of scoring better on the task. These people genuinely knew when they were right and when they were wrong. They had what seven-year-old me completely lacked: the ability to recognize the difference between "I understand this" and "I'm actually just making stuff up."
They also found something crucial: intellectually humble people weren't doubting everything equally. They were correctly calibrated, believing true things more and false things less because they were willing to update their beliefs when evidence changed.
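To make "calibrated, not just skeptical" concrete, here is a minimal sketch of the signal-detection framing the paper's title points to (discernment versus response bias). The data, variable names, and numbers below are invented for illustration; they are not taken from the study.

```python
from statistics import NormalDist

# Hypothetical data: 1 = headline judged "true", 0 = judged "false".
# In the task, half of the 60 headlines are real and half are false.
judgments_on_true_headlines  = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # a 1 here is a hit
judgments_on_false_headlines = [0, 1, 0, 0, 0, 0, 1, 0, 0, 0]  # a 1 here is a false alarm

def true_rate(judgments):
    """Proportion judged 'true', nudged away from 0 and 1 so the z-transform is defined."""
    n = len(judgments)
    p = sum(judgments) / n
    return min(max(p, 0.5 / n), 1 - 0.5 / n)

hit_rate = true_rate(judgments_on_true_headlines)           # belief in real headlines
false_alarm_rate = true_rate(judgments_on_false_headlines)  # belief in fake headlines

z = NormalDist().inv_cdf
discernment = z(hit_rate) - z(false_alarm_rate)              # d': telling true from false
response_bias = -0.5 * (z(hit_rate) + z(false_alarm_rate))   # c: overall pull toward "false"

print(f"discernment (d') = {discernment:.2f}")    # ~1.68 with the numbers above
print(f"response bias (c) = {response_bias:.2f}") # ~0.00: no blanket skepticism
```

A blanket skeptic lowers belief in everything, which mostly shifts the bias number without improving discernment. The humble pattern in the study is the opposite: believing true things more and false things less, which shows up as higher discernment with little change in bias.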
What Dies When We Can't Change Our Minds
I keep coming back to my dad pulling over on that highway shoulder. He had expertise. He had patience. He even demonstrated it.
None of it mattered. Because I had something stronger than expertise or evidence. I had certainty.
And this is what we've actually lost. Not expertise. Not information. What we've lost is the cultural capacity for uncertainty. The ability to hold our beliefs lightly enough that evidence can actually change them.
We've spent years building fact-checking infrastructure, media literacy programs, and algorithm reforms. Maybe we've been solving for the wrong variable.
The limiting factor isn't access to accurate information. It's whether we're the kind of people who can look at evidence and think, "Huh. I guess I was wrong about that."
That's not an education problem. That's a character problem.
What Expertise Actually Is
Expertise isn't just knowing a lot of things. Expertise is having a well-calibrated sense of what you know, what you don't know, and what evidence would change your mind.
Expertise is a form of intellectual humility, not opposed to it.
My dad was an expert in that moment because he understood the limits of perception. Seven-year-old me wasn't suffering from lack of information. I was suffering from inability to imagine I could be wrong.
The overconfident amateur who thinks they know everything after reading three blog posts has the same problem. So does the nihilistic skeptic who dismisses all expertise because "experts have been wrong before."
Both share the same character deficit: they have never developed the metacognitive skill of knowing when their own thinking is reliable.
Where This Leaves Us
Eventually, I figured out the moon wasn't following us. I don't remember the exact moment. But I changed my mind.
The thing is, I'm not sure we have cultural infrastructure anymore for that kind of mind-changing.
We have schools that teach information, but not calibration. We have media that trades in certainty, not careful updating. We have platforms that reward conviction, not reconsideration.
The researchers acknowledge they don't know how to teach intellectual humility at scale. Maybe that's the real work ahead: not building better fact-checking tools, but building better people.
People willing to hold their beliefs lightly enough to change them.
The Moon Is Still There
My kids know the moon isn't following us. But they also know the story of their dad insisting it was, with full childish certainty.
Because expertise isn't dead. It never died. What died, or maybe what we never fully developed, is our ability to do what I eventually did: look at the evidence, feel the discomfort of being wrong, and let go of certainty in favor of truth.
References
Prike, T., Holloway, J., & Ecker, U. K. H. (2024). Intellectual humility is associated with greater misinformation discernment and metacognitive insight but not response bias. Advances in Psychology.
https://advances.in/psychology/10.56296/aip00025/
When's the last time you changed your mind about something that mattered?
About Enthusiastic Generalist: This blog explores ideas across disciplines. Science, leadership, faith, parenting, book reviews, personal essays, and occasional deep dives into whatever I find curious. If you enjoyed this post, consider subscribing.

