Of Scouts and Soldiers
The late Richard Feynman, one of the great scientists of the 20th century and a notable influence in my physics education, once said, “The first principle is that you must not fool yourself — and you are the easiest person to fool.” Feynman understood what cognitive science would later show: “when it comes to what we believe, humans are masters of self-deception…we see what we want to see.”
Julia Galef, in her book The Scout Mindset: Why Some People See Things Clearly and Others Don’t, observes that people are motivated to defend their beliefs against evidence or arguments that might threaten them. Julia calls this human tendency a soldier mindset, and it stems from a natural human desire to belong, to live in harmony with family, community, and like-minded others. That belonging unleashes a powerful, overriding force where reasoning becomes like defensive combat, ever ready to defend one’s side like a soldier defending hard-fought territory.
As individuals defend their side, picking and choosing evidence in support of that defense, their underlying values shift as that side comes to own their identity. They think they are objective because they feel objective. They analyze their own logic and find it sound. They call themselves “rational,” which means they like their own argument. They see themselves as fair and unbiased because that’s how motivated reasoning works. But, in the words of Jon Haidt, they become bound and blind to the sensibilities of their groups — and don’t know it.
What if individuals saw things as they are, not as they wish they were? Instead of defending one’s territory, what if they valued accuracy through a practice of being skeptical about what they know, challenging their assumptions with a willingness to change course? Julia calls this practice a scout mindset, where the goal isn’t to defend the existing map of beliefs but to survey the territory and make the map as accurate as possible.
A scout prioritizes curiosity and openness to evidence, with a willingness to be wrong. Mistakes are part of the process, seen as corrective rather than feared. A scout embraces incremental change through a process of continual updating. They resist the temptation of certainty, unafraid of evidence contrary to their beliefs and viewpoints. They pay close attention to their analytic methods, relying upon a data-driven process that is consistently and comprehensively applied. They resist extrapolating from anecdotes. Because accuracy is the goal.
Both the scout and soldier mindsets serve as theoretical archetypes, as neither can be perfectly followed. People are usually a mixture of both, depending on context and issue. But when communities capture one’s identity and turn allegiances towards their cause, reasoning becomes more motivated and a permanent soldier mindset sets in. Especially in today’s culture-war-driven world, where the stakes are sky-high and the outcomes apocalyptic.
You can’t detect motivated reasoning by pure self-examination. Most people see themselves as reasonable, smart and knowledgeable, and aware of motivated reasoning — characteristics which seem like they should produce a scout mindset — yet they still function more like soldiers than scouts.
Each one of us needs a healthy skepticism about ourselves and our tribe, mindful of how our need to belong binds us to our groups. We all need an outside perspective, someone on the other side who can point to a view we cannot see ourselves. And only a few of us have that rare ability to understand the opposing side, able to comprehend each side’s motivations and presuppositions, free from the ever-present influence of tribal sensibilities.
Five discriminating questions about whether we are soldiers or scouts
1. Are we intellectually honorable, truth seekers through curiosity and an openness to change our views?
2. What do we find more trustworthy: the opinions of select people or experts who share our values, or an objective, systematic, weighing of valid evidence and expert judgement across the spectrum?
3. Do we pick and choose facts or is our reasoning consistently applied across all relevant data?
4. Do we hold our tribal identities loosely, willing to resist defending our side when a belief or argument goes against that of our group?
5. Are views from outside of our group deemed essential for constructing accurate viewpoints?
Let’s try this out, recognizing that we are all poor self-assessors, calling ourselves objective, rational, fair, and unbiased when we’re often not. But it’s worth a try, and maybe, through thinking about these issues, we will become more self-aware about the underlying drivers of our thinking.
For each of five issues or beliefs, what if we score them on a scale of 1–5 using the following five questions?
· Open to change? 1 = certain of our views; 5 = no preconceptions
· Trustworthiness? 1 = opinions from those who share our beliefs; 5 = systematic analyses of all data/experts
· Was our analytic process consistent with those used in other, less controversial, issues? 1 = no; 5 = yes
· Do we rush to defend our “side” in this issue? 1 = quick to defend; 5 = no dog in the fight
· Do we require views outside of our “side” to ensure we find the truth of this issue? 1 = rarely; 5 = usually
In each of the five questions, the lower the score, the more the soldier mindset. Hence if we sum across the five questions for each issue, a pure soldier mindset would yield a score of 5 and a pure scout mindset a score of 25.
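The rubric above is simple enough to sketch in code. Here is a minimal illustration in Python; the question labels and example ratings are hypothetical, chosen only to show how the five ratings sum to a score between 5 (pure soldier) and 25 (pure scout):

```python
# Sketch of the scout/soldier scoring rubric described above.
# Question labels and example ratings are illustrative, not prescriptive.

QUESTIONS = [
    "Open to change?",
    "Trustworthiness?",
    "Consistent analytic process?",
    "Rush to defend our side?",
    "Require outside views?",
]

def mindset_score(ratings):
    """Sum five 1-5 ratings for one issue; 5 = pure soldier, 25 = pure scout."""
    if len(ratings) != len(QUESTIONS):
        raise ValueError("expected one rating per question")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("each rating must be between 1 and 5")
    return sum(ratings)

# Hypothetical self-assessment for one issue: middling openness,
# soldier-like quickness to defend.
example = [3, 2, 4, 1, 2]
print(mindset_score(example))  # 12, closer to the soldier end of the 5-25 range
```

Scoring each of the five issues this way makes it easy to compare totals side by side, which is the point of the exercise that follows.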
One issue pertains to a life risk (Roundup) in which the science is mixed, both in the peer-reviewed literature and in expert opinion. The other four issues address life risks where there is either a near concordance of findings from peer-reviewed scientific research and/or widespread acceptance by credentialed experts.
At the same time, a minority of experts or studies espouse an alternative point of view on each of these four other issues. For some issues, such as smoking and cancer risk, this alternative view is likewise rejected by the public, resulting in public opinion closely matching scientific opinion. But for others, such as covid vaccine safety, a sizable portion of the public disagrees with the predominant view of the experts.
So here’s the test. Do our truth judgements follow the same process in all five cases? When we encounter differing issues in life, especially those that affect our physical health (treatments for disease, use of toxic substances, medications or drugs, etc.), does our approach change depending upon the issue? To wit: how do our scout/soldier scores compare across issues that get attached to the culture war — where one or both sides acquires a partisan identity?
Again, recognizing, of course, that we are often poor self-assessors, the best answers will come from scout-like people outside our tribal allegiances who know us well enough, and who are courageous enough, to speak a truth we’re willing to receive.