---------------------------------------------------
The moon is made of cheese — mozzarella, to be precise. By saying that, I may have signed my own death warrant. You see, they don't want us to know. They'll claim I'm mad. But as Kurosawa said, "In a mad world, only the mad are sane."
"But men have walked on the moon," you say. Wrong. It was all a fake, filmed in a studio by NASA. Haven't you seen the movie Capricorn One? If it weren't for lawyers, that would have been billed as a documentary.
"But other non-manned trips have been made to the moon." Most of them were fakes too. Some weren't, and those were the ones that brought back samples proving the mozzarella theory. But of course, the evidence has been suppressed.
"But people can look at the moon through telescopes." Right, and you're telling me that you can tell from that whether the moon is hard rock or soft cheese?
"But if this were true, surely it would have got out." Would you keep quiet, perhaps getting paid off handsomely; or be killed or discredited as a madman?
Think about it: how else would Elvis be able to stay alive up there if he didn't have an endless supply of cheese?
Baggini, J., The Pig That Wants to Be Eaten, 2005, p. 181.
---------------------------------------------------
We've all seen this guy, right? He might even be a family member. But usually, the time he spends in our lives is the exception rather than the rule. In a 2015 study, researchers found "an astounding 91.53 percent of people who like posts on conspiracy theory pages pretty much only engage with conspiracy theory pages." That kind of isolation is surely vital to the staying power of the wackiest conspiracy theories, but what about less extreme claims like Obama's foreign birth or climate change denial? In his discussion of this thought experiment, Baggini noted that conspiracy theories are made possible by two limitations of knowledge formation. In philosophical terms, these are two problems of epistemology:
- The holistic nature of understanding—i.e. the way that any single piece of knowledge is connected to countless other beliefs, all of which are bound together and usually reinforce one another.
- The "underdetermination of theory by evidence," which in plain language means that facts never provide enough evidence to conclusively prove one theory and one theory only. There is always a possibility that an alternative theory is true, which is why courts insist only on proof beyond "reasonable doubt."
This aligns with one of the main tenets of evolutionary philosophy, which is that knowledge is only ever probable. According to this worldview, knowledge comes from using reason to understand our sense experiences, and the iterative nature of the scientific method is what hones this process towards truth. But because we cannot know the future, that truth is always contingent; it is subject to change because new evidence might just come along to contradict anything we currently believe. Anything and everything.
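To see what "only ever probable" knowledge looks like in practice, here is a minimal sketch of my own (the numbers are purely illustrative, not drawn from Baggini) of belief updating under Bayes' rule, where every new observation strengthens a hypothesis without ever making it certain:

```python
def bayes_update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability of a hypothesis after one observation (Bayes' rule)."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1 - prior))

# Hypothesis: "the moon is rock, not cheese." Start maximally uncertain.
belief = 0.5

# Suppose each observation (telescope spectra, returned samples, seismology...)
# is 19x more likely if the hypothesis is true. Illustrative numbers only.
for i in range(1, 11):
    belief = bayes_update(belief, p_obs_if_true=0.95, p_obs_if_false=0.05)
    print(f"after observation {i:2d}: {belief:.10f}")

# Belief races toward 1 but never reaches it. Knowledge stays probable, not absolute.
```

That tiny gap between "really, really likely" and "certain" is exactly where the conspiracy theorist pitches his tent.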
But still, moons made of cheese? Isn't that beyond a reasonable doubt? Isn't blind skepticism just as bad as blind faith?
Maybe so, for as the classic definition going back to Plato has it, knowledge is "justified true belief." And as David Hume said, "A wise man proportions his belief to the evidence." So yes, a seemingly tiny possibility still lurks out there that an evil demon or a computer simulation is currently manipulating reality so that 2 + 2 only looks like it equals 4, but you would have to be very unreasonable to rely on things like that and ignore all the evidence that the moon is really, really likely to NOT be made of cheese. In my Response to Thought Experiment 3: The Indian and the Ice, I already looked at how we accept or reject new evidence, and I pointed out the importance of being aware of the lengthy List of Fallacies on Wikipedia (which contains 128 separate entries). But there are also the 174 cognitive biases listed on Wikipedia, which are "tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment." Given such a "standard of rationality," it's obvious from the dialogue above that the conspiracy theorist is committing several errors. And so, rationally, changing our minds when new evidence arises or better arguments come along should be easy. Right?
Ha!
This is where this week's thought experiment gets interesting. I'm sure everyone reading this has ample evidence from their own experience of how difficult it is to change beliefs, but even in the supremely rational world of philosophy, discussions are still plagued by stubbornness. In his excellent book Cosmopolitanism: Ethics in a World of Strangers, the philosopher Kwame Anthony Appiah wrote:
"I am a philosopher. I believe in reason. But I have learned in a life of university teaching and research that even the cleverest people are not easily shifted by reason alone. Conversation, as I’ve said, is hardly guaranteed to lead to agreement about what to think and feel."
What causes this? One issue, I've come to believe, is that conversation in a debate probably moves too quickly to change anyone's deeply held beliefs. I regularly listen to The Moral Maze, and the panel members there, discussing philosophical issues with their guests, predominantly talk right past one another. No one is ever persuaded. In Daniel Kahneman's book Thinking, Fast and Slow, we might just see why, in his description of the two different ways our brains form thoughts. They are:
- System 1: fast, automatic, frequent, emotional, stereotypic, subconscious
- System 2: slow, effortful, infrequent, logical, calculating, conscious
So that's one error: expecting people to quickly change their whole interconnected rationales when our brains are not wired that simply. But there is another mistake typically made by advocates for change too.
In Rebecca Newberger Goldstein's novel 36 Arguments for the Existence of God, one of her characters notes that, "The work of ethics is the work of getting one’s self to this ‘View from Nowhere’ and keeping it relevant to how one sees the world and acts. There are truths to discover in that process, and they’re truths that make us change our behavior." Although this sounds great, and it works for a philosopher locked in a room with his or her strong System 2 brain at work, we humans are predominantly social creatures who always have a "view from somewhere," and most people rely heavily on others to inform it. We can't all be experts in everything. In fact, cooperative species evolve to be composed of a majority of followers; the reverse simply wouldn't work.
This point was made well in a Conversation with Richard Dawkins, a BBC Radio 3 event that was part of the 2015 Free Thinking Festival. There, a child of about 10 in the audience asked Dawkins if "religious blokes are crazy or are they just brought up like that?" Dawkins' response was illuminating:
"Oh gosh, what can I say? Brought up that way. The great majority of people have the same religion as their parents, and that really gives the game away. The fact that children inherit the religion of their parents and the fact that we as a society label them as such is what drives this. We call a child of two a Catholic child, as if they had the faintest idea of what that means. A midwife will hold a newborn baby up and ask, "What religion is he?" It's shocking if you think about it. It's just as absurd as asking, "Is he a logical positivist? Is he an existentialist? Is he a Marxist?"
I love that response. But it's not just religions that are passed on by cultural surroundings, and there is more to prove it than an off-the-cuff remark from Richard Dawkins. An article titled We Aren't the World explored some of the depth of cultural influence in this passage:
For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve. [In a paper by Henrich, Heine, and Norenzayan titled “The Weirdest People in the World?”], Henrich has challenged this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own. He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way. When Henrich asked Fijian women why they avoided certain potentially toxic fish during pregnancy and breastfeeding, he found that many didn’t know or had fanciful reasons. Regardless of their personal understanding, by mimicking this culturally adaptive behavior they were protecting their offspring. The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.
Taking cues from your culture is one of those cognitive biases that generally works well, but it has dark sides too, and it is not easily overcome. A New York Times op-ed by Cass Sunstein titled Breaking Up the Echo shows why:
It is well known that when like-minded people get together, they tend to end up thinking a more extreme version of what they thought before they started to talk. The same kind of echo-chamber effect can happen as people get news from various media. Unfortunately, evidence suggests that balanced presentations — in which competing arguments or positions are laid out side by side — may not help. On the contrary, people’s original beliefs tend to harden and the original divisions typically get bigger. Balanced presentations can fuel unbalanced views. What explains this? The answer is called “biased assimilation,” which means that people assimilate new information in a selective fashion. When people get information that supports what they initially thought, they give it considerable weight. When they get information that undermines their initial beliefs, they tend to dismiss it.
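As a rough illustration of how biased assimilation works, here is a toy simulation of my own (the asymmetric weights are illustrative assumptions, not estimates from the research the op-ed cites). Two agents see the exact same balanced stream of evidence, but each weights agreeable items more heavily:

```python
import random

def biased_update(belief, evidence_supports, w_confirm=0.15, w_disconfirm=0.03):
    """Nudge a belief (probability that a claim is true) toward one piece of
    evidence, weighting confirming items more heavily than disconfirming ones.
    'Confirming' means the evidence points the way the agent already leans."""
    leans_yes = belief > 0.5
    weight = w_confirm if evidence_supports == leans_yes else w_disconfirm
    target = 1.0 if evidence_supports else 0.0
    return belief + weight * (target - belief)

random.seed(42)
believer, skeptic = 0.7, 0.3   # two agents with opposite starting beliefs
evidence = [True, False] * 50  # a perfectly balanced presentation
random.shuffle(evidence)

for item in evidence:          # both agents receive the identical stream
    believer = biased_update(believer, item)
    skeptic = biased_update(skeptic, item)

print(f"believer: {believer:.2f}, skeptic: {skeptic:.2f}")
# Typical result: believer ends near 1.0 and skeptic near 0.0. The balanced
# presentation pushed the two agents further apart, not closer together.
```

The specific numbers don't matter; any updating rule that weights agreeable evidence more heavily will turn balanced input into polarization.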
So balanced presentations don't help to moderate behavior, and ignoring other points of view through filter bubbles doesn't either. From an evolutionary perspective, this kind of isolation of individuals or ideas leads to divergence from other populations of individuals or ideas. Relative to one another, one of these diverged groups must be less fit for survival. In today's globally interconnected world, however, these groups will eventually come together, and then mutually exclusive ideas (e.g. free markets vs. communism; Catholic inquisitions vs. pantheism; economic dominance vs. environmental protection) must either adapt or go extinct. It would be much better for all involved if the changes didn't have to come from violent conflict (might doesn't make right), so it's vital we figure out how to root out truly maladaptive thoughts using reason alone. How can we do this?
Cultures have changed. And not always by violent revolution. So we know this is possible. Fortunately, many solutions are starting to be uncovered. Later in the op-ed, Sunstein cites an example of the potential for lightning strikes of change from one person to another.
Can anything be done? There is no simple term for the answer, so let’s make one up: surprising validators. People tend to dismiss information that would falsify their convictions. But they may reconsider if the information comes from a source they cannot dismiss. People are most likely to find a source credible if they closely identify with it or begin in essential agreement with it. In such cases, their reaction is not, “how predictable and uninformative that someone like that would think something so evil and foolish,” but instead, “if someone like that disagrees with me, maybe I had better rethink.”
This may be helpful if you've already undergone a life-altering conversion and just want to go back to a group you are already an accepted member of and try to change some of their minds by being their "surprising validator." But how do you undergo the life-altering conversion in the first place? The System 2 philosopher focusing on their thoughts in their room is one way. But there has been some excellent research done on other ways, and I'd like to quote from four sources at length about this because they are so good and this topic is so important. If you're less interested in the details of this than I am, skip to the end and just know that people are figuring out how to change you... : )
First, Keith Stanovich, emeritus professor of applied psychology and human development, wrote an excellent article about his research in Scientific American titled Rational and Irrational Thought: The Thinking That IQ Tests Miss. He said:
---------------------------
No doubt you know several folks with perfectly respectable IQs who repeatedly make poor decisions. The behavior of such people tells us that we are missing something important by treating intelligence as if it encompassed all cognitive abilities. I coined the term “dysrationalia” (analogous to “dyslexia”), meaning the inability to think and behave rationally despite having adequate intelligence. It is useful to get a handle on dysrationalia and its causes because we are beset by problems that require increasingly more accurate, rational responses. IQ tests do not measure dysrationalia. But as I show in my 2010 book, What Intelligence Tests Miss: The Psychology of Rational Thought, there are ways to measure dysrationalia and ways to correct it.
Decades of research in cognitive psychology have suggested two causes of dysrationalia. One is a processing problem, the other a content problem. Much is known about both of them. The processing problem comes about because we tend to be cognitive misers. [i.e. We have cognitive biases.] ... The second source of dysrationalia is a content problem. We need to acquire specific knowledge to think and act rationally. Harvard cognitive scientist David Perkins coined the term “mindware” to refer to the rules, data, procedures, strategies and other cognitive tools (knowledge of probability, logic, and scientific inference) that must be retrieved from memory to think rationally. The absence of this knowledge creates a mindware gap—something that is not tested on typical intelligence tests.
...for most people, this bit of mindware must be taught until it becomes second nature. As research in my lab and elsewhere has shown, rational thinking can be surprisingly dissociated from intelligence. Individuals with high IQs are no less likely to be cognitive misers than those with lower IQs. Avoidance of cognitive miserliness has a correlation with IQ in the range of 0.20 to 0.30 (on the scale of correlation coefficients that runs from 0 to 1.0). Sufficient mindware has a similar modest correlation, in the range of 0.25 to 0.35. These correlations allow for substantial discrepancies between intelligence and rationality. Intelligence is thus no inoculation against any of the sources of dysrationalia I have discussed.
---------------------------
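Those correlation figures are easy to misread, so here is a quick bit of arithmetic (my gloss, not Stanovich's): squaring a correlation coefficient gives the share of variance in one measure that the other accounts for.

```python
# Variance explained (r squared) across Stanovich's reported correlation range.
for r in (0.20, 0.25, 0.30, 0.35):
    print(f"r = {r:.2f}  ->  variance explained = {r * r:.0%}")

# Output:
# r = 0.20  ->  variance explained = 4%
# r = 0.25  ->  variance explained = 6%
# r = 0.30  ->  variance explained = 9%
# r = 0.35  ->  variance explained = 12%
```

In other words, IQ accounts for something like 4 to 12 percent of the variation in these rational-thinking measures, which is why high intelligence is no inoculation against dysrationalia.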
Stanovich gives six puzzles in his article which require the "mindware" of probability, logic, and scientific inference to solve. I highly recommend reading the article in full and trying to solve those puzzles for yourself to see what mindware updates you might consider getting or sharing with others.
The second source on change I'd like to note comes from the management consulting guru Edgar Schein in his book Helping: How to Offer, Give, and Receive Help. Any time you want someone else to change, you are essentially offering to help them make a change for the better. But such situations are fraught with difficulty, which Schein describes well from his decades of experience with the matter. He writes:
---------------------------
Helping situations are intrinsically unbalanced and role-ambiguous. Emotionally and socially, when you ask for help you are putting yourself “one down.” It is a temporary loss of status and self-esteem not to know what to do next or to be unable to do it. It is a loss of independence to have someone else advise you, heal you, minister to you, help you, support you, even serve you.
[So how can you change / help someone?] A central proposition in helping can now be stated. Any helping situation must begin with the helper adopting the process consultant role in order to do the following three things: 1) Remove the ignorance inherent in the situation. 2) Lessen the initial status differential. 3) Identify what further role may be most suitable to the problem identified.
The essence of the process consultant role at the beginning of a helping relationship is to engage in humble inquiry. The kind of communication process that will most equilibrate the social statuses of client and helper is for the helper to give something of value to the client. It is the client who is initially one down and therefore vulnerable to being confirmed as indeed being of less value for having a problem. It is the helper who must enter this dynamic in a supportive, giving, ego-enhancing way. The first intervention must always therefore be humble inquiry, even if the inquiry is merely careful observation and listening in the first few moments of the encounter.
One of the most counterintuitive principles of managed change is that you can’t change anyone until you can turn them into a client who is seeking help from you. Leaders who want to fix things will be the most successful if they initially adopt a helping role which, in turn, requires their willingness to be helped. One of the truisms about change is that people don’t mind change; they just don’t want others to change them.
---------------------------
I find that to be a beautiful explanation of why truly humble inquiry really matters before change is possible. I also think Schein ends up drawing an important distinction between this type of inquiry and the often antagonistic Socratic method depicted by Plato, where the know-it-all philosopher asks leading questions until his unequal partner in the dialogue sees the light of day. In real life, that gets you punched. Or poisoned with hemlock. I wish I had read this while I was still a consultant.
The third source on change I want to cite comes from social psychologist Timothy Wilson in his book Redirect: The Surprising New Science of Psychological Change. This book is full of powerful research experiments that have changed lives for the better, and it explains the theoretical bases for why they work.
---------------------------
A central principle of social psychology [is] that people are motivated to perceive themselves as good, competent, moral people, and that when that view is threatened, they do what they can, psychologically, to repair their self-image. ... Hundreds of experiments, conducted mostly in the laboratory with college-student participants, have documented how important it is for people to maintain a positive self-image and the lengths to which they will go to accomplish this. Typically, people try to deal with threats to their self-esteem directly. But what if that doesn’t work? Human beings are excellent rationalizers, and are great at finding an explanation that deflects blame away from themselves. Rationalization works best if it occurs behind the scenes, unconsciously, so that one doesn’t know they are coming up with ideas in order to make themselves feel better—they are simply telling it like it is. But even rationalization has its limits. I can only go so far with the “misunderstood genius” defense before it crumbles in the face of reality.
Cognitive behavioral therapy assumes that maladaptive interpretations—negative thought patterns—are responsible for many mental health problems, and that the best way to treat those problems is to make people aware of their thought patterns and learn how to change them. Story editing is a set of techniques designed to redirect people’s narratives about themselves and the social world in a way that leads to lasting changes in behavior. This approach is most useful for people who have failed to come up with a coherent interpretation of an important event in their lives. ... The traumas that cause prolonged stress are usually the ones that we can’t make sense of; they are profoundly troubling because they seem like meaningless, random acts that don’t fit into our view of the world as a predictable, safe place. [Ed. note: Any study of evolutionary history will show that it is not a safe place.] Giving people the opportunity to affirm themselves, even with a task as simple as writing about the values that are important to them, turns out to be a powerful prophylactic against threats to their self-esteem.
We have powerful techniques at our disposal based on the story-editing approach. This is a family of techniques that share three assumptions: first, in order to change people’s behavior we have to see the world through their eyes. It’s not just about incentives, as it is to an economist; it’s about the way in which we interpret ourselves and the social world. Second, these interpretations are not always set in stone, and in fact can be redirected with just the right approach. Third, small changes in interpretations can have self-sustaining effects, leading to long-lasting changes in behavior.
---------------------------
The book contains lots of examples of how asking someone to write a particular story can have lasting effects on how they go on to frame their interpretations of the rest of the world. As an author who wants readers to reframe their worlds, I find this fascinating, but there appears to be extra power in asking people to write the stories for themselves. Hmmmm. Guest blog anyone?
Finally, the evolutionary biologist David Sloan Wilson, who is setting up evolution-based think tanks and magazines to change the world, shared some research on change in his 2015 paper Evolving the Future: Toward a Science of Intentional Change. He and his co-authors wrote:
---------------------------
When the cognitive revolution dethroned behaviorism in academic psychology during the second half of the 20th century, behaviorism did not disappear. Instead, it developed into a robust set of methods for accomplishing behavioral change in a variety of applied disciplines. Behavior therapy was gradually supplemented (not replaced) by cognitive therapy, which in turn has been supplemented by acceptance and mindfulness-based techniques with proven efficacy, in what is sometimes called a “third wave” of cognitive behavioral methods.
Behavior therapy works by altering the selective environment: for example, by repeatedly exposing clients who fear spiders to the objects of their fear without adverse consequences so that they can acquire a wider range of responses besides avoidance in their presence. ... Cognitive behavior therapy goes beyond behavior therapy by encouraging clients to re-conceptualize their problems. ... A variety of evidence-based practices have emerged over the last few decades...such as mindfulness meditation, attentional training, emotional acceptance, and deliberate use of perspective taking. ... Therapies that teach people simply to notice their thoughts without automatically having to obey them induce healthy flexibility that can help people solve [their problems].
A science of positive intentional change is surprisingly close, once successful research programs in the applied behavioral sciences are related to core evolutionary theory. The principles that we have outlined for individuals are equally relevant to groups of all sizes. Groups can benefit by increasing their behavioral flexibility and reflecting upon their values in selecting their practices, no less than individuals. However, an additional set of considerations are required for groups to function as “corporate units” in this sense.
Elinor Ostrom shared the 2009 Nobel in economics for showing that groups of people are capable of managing their common resources on their own, but only when certain conditions are met. Empirically, she was able to identify eight design features that enable groups to manage their common-pool resources successfully: 1) Strong group identity and understanding of purpose; 2) Fair distribution of costs and benefits; 3) Fair and inclusive decision-making; 4) Monitoring agreed-upon behaviors; 5) Graduated sanctions for misbehaviors; 6) Fast and fair conflict resolution; 7) Authority to self-govern; 8) Appropriate relations with other groups.
A grade-school teacher invented a set of practices called "The Good Behavior Game" (GBG), which prevention scientists have refined and assessed over a period of decades. The game, as played in several thousand classrooms today, has most, if not all, of the core design features identified by Ostrom for common-pool resource groups. The GBG begins by establishing norms of good behavior by consensus. Even first graders are able to list the appropriate dos and don’ts, but the important fact is that they are their lists and not lists arbitrarily imposed upon them by the teacher and school. Once the norms of good behavior have been established and suitably displayed in the classroom, the class breaks up into groups that compete to be good while doing their schoolwork. Groups that stay under a certain number of misbehaviors receive a small reward, such as picking from a prize bowl of activities like singing a song or dancing for a minute. At first, the students play the game for brief periods with immediate rewards. Gradually the game lengthens and occurs without any previous announcement. The rewards gradually come later, eventually arriving only at the end of the day or week, until the norms of good behavior become the culture of the classroom. Potentially destructive aspects of between-group competition are managed by periodically shuffling the composition of the groups.
---------------------------
I think the Good Behavior Game could be adapted for all kinds of groups that need to change specific behaviors, but I'm more interested in Elinor Ostrom's work. Her research on what it takes for groups to "manage common resources" (i.e. avoid the tragedy of the commons) is particularly applicable to all of humanity and the common resource we share called Earth. Could we make use of her work, and all the rest of the science on change in this article, to help change the world for the better and ensure our survival? Only if Ostrom's first criterion is met: strong group identity and understanding of purpose. And that brings me back to the core of EvPhil, which is my effort to clarify and spread the universal, objective basis for morality that I have argued for. Once I figure out how to change people's understanding of that, it's only a few short steps to figuring out how to change the world...