In one of those odd synchronicities, I’ve been having more in-person conversations recently about cults in general and conspiracy theories* in particular. We’ve been gnawing around this question of what causes people to believe in things in the first place—to believe in anything at all, whether conspiracy theories or established religion.
I don’t have any answers, though it’s curious to me that some kinds of minds seem to find it easier to believe in what can’t be proven than in what can. I keep coming back to wondering whether fear is at the heart of it.
Cory Doctorow recently published a book on Medium (it’s a 109-minute read and well worth it) called How to Destroy Surveillance Capitalism. Before getting into the very real, very urgent problems of surveillance capitalism itself, he wrote probably the smartest thing I’ve read on the disconnect between conspiracy theories and facts:
What if the trauma of living through real conspiracies all around us — conspiracies among wealthy people, their lobbyists, and lawmakers to bury inconvenient facts and evidence of wrongdoing (these conspiracies are commonly known as “corruption”) — is making people vulnerable to conspiracy theories?
“Trauma, not contagion,” he reiterates in the next paragraph, is what we need to look at to understand the divisive and often violent rhetoric flaring up all around us.
There’s something in that, but I also, again, look to embodiment, or the lack of it, as part of the problem, exacerbated by fear of what we can’t know or control. Why is it so hard for large numbers of people to believe that scientific research on, say, climate change reflects reality? Part of it is identity—denying climate change tends to be bound up with a particular political identity, and nothing is harder to change than a person’s view of their own identity, because it’s who they are—but part of it is also in how we teach science: as cold research unconnected to the lives we live and the planet we live them on.
Some science writer friends and I had a conversation many months ago about the failings and pitfalls of science communication, and the responsibility of those of us who write about scientific research to help people understand it. But when people don’t see science as having anything to do with their lives, even the best communication can only go so far.
When my son was in 5th grade, he looked forward to his first science lessons. He looked forward to them until the first one, which focused on learning the scientific method and didn’t leave that subject for at least two weeks, quickly instilling an aversion to science in who knows how many formerly eager school kids. Now, I have nothing against the scientific method. It is obviously a very important thing to learn. But teaching it before kids have a chance to run their own messy experiments, to try out their own observations of something they’re curious about—in short, before they come to know and love science as a way to explore and investigate everything they want to understand about the world—leaves them feeling that science is something “other.” It doesn’t have to do with them, doesn’t have to do with life. You can explain six ways from Sunday that your washing machine uses fuzzy logic in order to function and nobody will care, even though fuzzy logic is a really amazing thing to think about and is part of how a Mars rover finds its way around another planet.
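(For the curious, here is a tiny sketch of what “fuzzy” means there: instead of a hard yes/no, a fuzzy controller works with degrees of truth and blends its rules accordingly. The dirt scale, membership ranges, and wash times below are invented for illustration, not taken from any real machine.)

```python
# Toy fuzzy-logic controller: the load isn't "dirty or not," it's dirty
# to a degree between 0 and 1, and the wash time follows from that blend.

def triangular(x, left, peak, right):
    """Degree (0..1) to which x belongs to a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def wash_time(dirt):
    """Wash minutes for a dirt reading on a 0-100 scale (values invented)."""
    # Overlapping memberships: a load can be partly "moderate" AND partly "heavy."
    lightly = triangular(dirt, -1, 0, 50)
    moderately = triangular(dirt, 20, 50, 80)
    heavily = triangular(dirt, 50, 100, 101)
    # Rules: lightly dirty -> 20 min, moderately -> 40 min, heavily -> 60 min.
    # The output is the weighted average of whichever rules fired.
    weights, minutes = [lightly, moderately, heavily], [20, 40, 60]
    return sum(w * m for w, m in zip(weights, minutes)) / sum(weights)

for dirt in (10, 35, 60, 95):
    print(f"dirt {dirt:3d} -> wash for {wash_time(dirt):.0f} minutes")
```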
I think of science education as equivalent to that period of time in parenthood when you agonize over whether to just let your kid touch the hot stove or not. Because no matter how many times you tell them it’s hot and it will hurt, they will never totally believe you unless they’ve experienced it themselves. Without that experience, without seeing, feeling, smelling, messing around with the world and seeing what it can do (in a fun poking-sticks-in-the-mud way, not a Facebook move-fast-and-break-things way), how can we expect people to believe what they haven’t experienced simply because some scientist said so?
The problem with early science education is reflected in our tendency to believe conspiracy theories and sloppy science—not because people are stupid or gullible, but because life is full of uncertainty anyway, and a poor science education makes that uncertainty an even scarier thing. It translates into a fear of making mistakes, because mistakes are penalized with a lower grade; it translates into thinking science is a realm for the “smart kids,” not for those of us who struggle to get it. So some look for answers that give them a more solid footing in the world, and others go further, looking for answers that don’t rely on the smart kids interpreting reality for them.
One essay that came into my inbox this week was a short one on Karl Jaspers, a philosopher I’d never heard of before, who, after World War II, urged his fellow Germans to face the realities of their country’s recent actions. He also, the author wrote, made a life’s work of arguing for becoming comfortable with uncertainty:
Jaspers believed that we might not be able to come to an agreement about who we are and what we want to be, but we can agree on what we don’t know and how we’d like to act toward this nonknowledge. . . . ‘All thoughts,’ Jaspers therefore concluded, ‘could be judged by this touchstone question, do they aid or hinder communication.’
I wish I’d come across Jaspers earlier in life. I will never persuade a believer in conspiracy theories that they’re wrong, and it’s long since evident that more facts don’t change a climate change denier’s mind. But wondering what could further communication might be more productive. And in the long run, a better, more hands-on, messier science education, full of dead ends and mistakes, could teach us both science’s potential and its limits, and help us become comfortable with uncertainty.
______
*Still looking for another word for “conspiracy theory” that is less loaded with dismissiveness. The closest I’ve come is “a theory that purports to explain pretty much every perceived wrong in the world and names the specific people supposedly responsible.” That is obviously not going to work.
I think it comes down to storytelling too, because that’s a large part of how we learned about the weirder parts of the world up until about five minutes ago. Believing in Sasquatch because of the great stories is fun. Believing in Jesus and all the biblical calamities might not always be fun, but they are compelling. I learned earlier this spring, while listening to and reading a number of earnest young environmental writers parade essay after essay related to climate change, that unless you personalize the story and make it relatable and emotional, the essays are just dull, regardless of the stakes. I don’t know what the answer is. But I do know that a series of what seem like “elitist know-it-alls” (and yes, the Left, which is where the bulk of climate change information comes from, has an elitist-know-it-all problem) citing stats and demands isn’t going to win over any of the, willfully or not, ignorant. Ignorant being another word that has an image problem, because I don’t use it here in a negative way. I’d say the same about COVID. People don’t believe until it hits them right where it hurts, until they have a personal story that isn’t about “the government taking something away from them for no reason.”
Heh. I couldn't immediately think of a less, erm, "toxic" term for "conspiracy theory," but "blame game" did come to mind as one equally loaded. What I think this way of thinking does is try to justify "us" vs. "them" and make us feel better about who we are, unfortunately without improving anything else about the situation, and perhaps making it worse.
As a side note, here's a family who seems to do well at teaching their kids to be curious and capable: https://www.youtube.com/channel/UCr9ib9quyHJkEchOck4PG2w
Thanks for the post, and may your land be at peace.