Shortly after first arriving in Washington, D.C., I had conversations with friends in which I made this observation: Assume that they and I hold completely different views on an issue. Assume, too, that we engage in a debate on the issue and that they pulverize me with their superior knowledge and logic. And let’s stipulate a third assumption: I know, deep in my bones, that I have been bested. Still, the odds are that I wouldn’t revisit my opinion; instead, I would probably get angry that my case had been demolished. What this would indicate is that I hold my positions not primarily on the basis of reason and empirical evidence but because of certain predilections, biases, and intuitions.
My arguments might be exposed as weak, but my faith in my position would likely remain strong.
I thought of these conversations after watching an interview with Jonathan Haidt, a professor of psychology at the University of Virginia. Professor Haidt, author of The Righteous Mind: Why Good People Are Divided by Politics and Religion, argues that reasoning is “post-hoc and justificatory.” Reasoning is not good at finding the truth, according to Haidt; rather, he argues, “conscious verbal reasoning is really good at confirming.” We’re like good lawyers or press secretaries; we seek out information to reinforce our existing opinions and try to justify everything. Once we sacralize something, we become blind to counter-evidence.
I know precisely what Haidt is talking about. It’s extremely easy to spot the weak arguments, hypocrisy, and double standards of those with whom I disagree; it’s much harder to see them in myself. And many of us, having arrived at comfortable, settled positions, go out in search of evidence to support our arguments. That is quite a different thing than assessing evidence in order to arrive at an opinion. What most of us do, to one degree or another, is self-segregate. We search for studies and data that confirm our pre-existing beliefs. And we tend to ignore the strongest arguments against our position.
This is a complicated matter. Our underlying views are not necessarily sub-rational; they are often grounded in moral intuitions and attitudes that are entirely legitimate. What we do in political debates is to extend what we take to be true, and in the process we reach for evidence that conforms to what Edmund Burke referred to, in a non-pejorative sense, as our prejudices.
We channel facts in a way that reinforces views that are based on something different than—something deeper than—mere empirical evidence. None of us, then, is completely open-minded, and we’re all understandably reluctant to alter deeply held views. The question, really, is this: given all that, how open are we to persuasion, to new evidence, and to subjecting our views to refinement and revision? How do we react when our arguments seem to be falling apart? And what steps can we take to ensure that we don’t insulate ourselves to the point that we are indifferent to facts that challenge our worldview?
According to Haidt, individual reasoning is not reliable because of “the confirmation bias”—and the only cure for the confirmation bias is other people. “If you bring people together who disagree,” he argues, “and they have a sense of friendship, family, having something in common, having an institution to preserve, they can challenge each other’s reason.” We’re not very good at challenging our own beliefs—but we’re quite good at challenging the beliefs of others. Our task is (to borrow from William Saletan’s review of Haidt’s book) “to organize society so that reason and intuition interact in healthy ways.”
That makes great sense to me. There’s a natural tendency to seek out a community of like-minded individuals who can offer support and encouragement along the way. In The Four Loves, C.S. Lewis writes that a friendship is born when two people discover that they not only share common interests but see the same truth; they stand not face-to-face (as lovers do) but shoulder-to-shoulder. There’s an important place for intellectual fellowship, just as there is for religious fellowship.
Still, it’s important to resist the temptation to surround ourselves exclusively with like-minded people, those who reinforce our preexisting views and biases. It becomes much too easy to caricature those with whom we disagree. (In those rare, self-aware moments, and sometimes with a gentle assist from others, it becomes obvious when I’m guilty of this.)
In the White House in particular, where you have access to more information than is available to most people and are surrounded by some of the leading experts in the country, it’s tempting to think that you and your colleagues are all-wise and your critics are all-foolish. And before long you can find yourself in an intellectual cul-de-sac. That’s a dangerous place to be. We need at least a few people in our orbit who have standing in our lives and who are willing to challenge what we claim and how we claim it. That is, I think, an important, even essential, element in striving for intellectual honesty.
Peter Wehner is a senior fellow at the Ethics and Public Policy Center.