
Beliefs are safety-seeking

That the United States is riven into two cultures who won’t date each other is so circumambiently obvious to every American that when someone points it out it’s hard to respond with anything other than rolled eyes, because it’s sort of like pointing out the air. Yet even though we’ve been going through this for hundreds of years on and off, we’re still using naive Enlightenment-era persuasion tactics to address the problem:

  • People don’t want to take the vaccine? Let’s tell them how it works, tell them about the safety studies.
  • People think the earth is flat? Let’s do the experiments that demonstrate its curvature and share the proof.
  • People think globalization and entitlements are the prime movers of job loss? Let’s point them at the Department of Labor’s studies on automation.

Think back to a time, any time, when you presented someone with evidence that one of their core beliefs was incorrect, or at least sloppy. Or more difficult, a time when you yourself were demonstrably wrong and put on the spot about it. Did either scenario lead to courteous conversation and a changed mind?

Interpersonally, most of us learn not to challenge others in this way after making the mistake once or twice. It goes nowhere. The evidence-based strategy turns two interlocutors into greased hogs grasping for purchase on each other with inarticulate hooves. In this essay I attempt to explain why that is, and what alternatives are available.

Human reasoning is unanimous

One easy explanation I see thrown around for why some people don’t respond to evidence is that they’re simply less reasonable than others. The problem with this is that humans are very similar on the whole, and since we’re talking about hundreds of millions of people disagreeing on some pretty salient points, it’s a safe heuristic to reject any explanation that does not also apply to ourselves.

In the book Perplexities of Consciousness, Eric Schwitzgebel walks through several case studies of people being absolutely, utterly wrong about the basics of what’s going on in their own heads, such as: does thinking have a unique feel, or is it just a composite of other sensory memories? People disagree sharply on this point. You can also sort of predict whether someone will report dreaming in black-and-white or in color based on whether their surrounding society watches black-and-white or color televisions, but if you wake them in the middle of the night during a dream you get a different distribution of answers. Any argument based on the self-reported phenomenology of humans—such as: I can understand the evidence because I am reasonable—is dubious out of the gate.

Instead, if we look from the outside, human behavior suggests our reasoning ability is pretty much unanimous. When we get into a vehicle, or deposit money in a bank, or play poker, we are betting on other humans reasoning the way we do. We bet our lives and life savings on it daily. This is a lot more predictive of actual events than betting humans differ so much in their reasoning that some conclude illnesses are good and others conclude illnesses are bad.

There’s something more dangerous than being wrong

The most dangerous thing in the world is not being wrong, except for contrived cases like believing you don’t need air. There are several greater and more immediate threats:

  • A group of humans deciding to harm you
  • The group of humans you reside in, in which you enjoy safety in numbers from other groups that might cause you harm, deciding you’re not that cool and casting you out

Whether the threats are real is less relevant than the fact that natural selective pressure has spent millions of years enhancing the cognitive machinery with which we perceive the threats. Anyone who’s been a teenager can recall how peer inclusion and status felt like life-or-death imperatives, and that’s because to the teenage brain they are, and for good reason: loners historically die with haste. That you can survive in the modern world without close friends is anomalous in human history.

But why would this necessarily interfere with our ability to reason? You can have friends and family who hold views you abhor but who you’d still give a ride to the airport. You don’t have to agree on everything. Until your group is so large that it includes people you don’t know, that is.

Imagine it’s a super long time ago, before pencils or pens, and you and your friends and family are chilling in this awesome glade with tons of fruit and nuts and whatnot. There’s about Dunbar’s number of you. Now imagine that nearby, there’s a group of five thousand humans who want your glade. You’re borked. Groups that can enlarge beyond the constraints of interpersonal relationships win out over time.

Groups so large that you have to coordinate with strangers need an ideological foundation. They need a set of beliefs that allow members to predict the behavior of the strangers at least enough that they can cooperate without worrying about betrayal. A modern example of this is that most Americans share the narrative that the United States exists, that its government is legitimate, and that human rights are a thing. About our ability to create these ideologies, historian Yuval Noah Harari says in his book Sapiens:

The real difference between us and chimpanzees is the mythical glue that binds together large numbers of individuals, families and groups.

So to belong to a group—which is required for your safety—and to maintain status in that group—also required for your safety—you need to hold on to the beliefs that unite the group and behave in a way that reassures everyone you sincerely believe. If you show signs of doubt, you become less predictable to your groupmates, a less sure bet, maybe even a potential threat if they suspect you’ll defect and betray the group’s interests.

The model offers a lot of explanatory power.

First, it explains why human reasoning can be so unanimous most of the time yet take on a different shape on certain issues. It’s not about the politics; most of us don’t understand the politics anyway, and aren’t literate in the history or complexities of the policies we profess strong feelings about. Our certainty is about staying safe.

This reminds me of a scene in Obama’s memoir Dreams from My Father, in which he’s trying to organize many parties with conflicting interests. He says of two such parties:

Both Marty and Smalls knew that in politics, like religion, power lay in certainty—and that one man’s certainty always threatened another’s.

Doubts orient us toward whoever promises safety, which means when one source of safety decays, there’s a vacuum to fill. In an NYT op-ed, Ross Douthat explains how he became receptive to alternative medicines, including a machine that purported to use sound to target certain bacteria in his body, because mainstream medicine failed to provide certainty of recovery. When we feel unsafe, our minds open up wide. The ability to detach from an unsafe group and commit to the beliefs of a new, safer group is an adaptive advantage over the true believers who go down with the ship.

Our sense of safety is a function of our personal interpretation of our circumstances, so people are going to have different thresholds for shedding faith. For example, in a shrinking company, the people who need to believe in the company’s continued existence to feel safe (because they wouldn’t be able to find another job) keep believing for much longer than the others. Andrew Yang, who while running Venture for America worked with hundreds of startup founders, says this in his book The War on Normal People:

There's a truism in startup world: When things start going very badly for a company, the strongest people generally leave first. They have the highest standards for their own opportunities and the most confidence that they can thrive in a new environment.

A more tendentious example is the phenomenon of theists questioning their faith after experiencing a hardship, such as a divorce or a death in the family. Neither divorce nor a death in the family defies the rules of the universe, so why would these events and the suffering they bring inspire doubt in their understanding of reality? The emic answer is that it’s a critical moment in their relationship with %GOD_NAME_HERE%, but the etic answer is that the sense of safety that undergirded the belief has lost its credibility.

Some people’s needs for safety are distorted. They don’t just need acceptance. They need worship. Say you’re a doctor, and you’re getting the respect of non-doctors but among your doctor peers you’re just meh and no one’s heard of you and if they saw your practice they might even have some cutting critiques. But you want to be loved and admired and this craving is crippling you. What do you do? One option is to “blow the whistle” on the nanobots in the flu shot and become an idol. Trade in what little cachet you have with the group where your needs aren’t met for massive clout with the opposed group. Examples include: the atheist that accepted Jesus, the conservative politician that decided to adopt scruples, the financier that wrote a book about how financialization is destroying society. In religious circles the slur for someone who does this is apostate.

There are less extreme examples. The hipster acts on the same principle: they eschew the mainstream because they can’t achieve sufficient status in it, and try to attract others to their style, which if successful would confer upon them first-mover status.

Second, modeling beliefs as a social sorting tool explains tons of signaling behavior, like tweeting scathing takes you know will only egg on your people and piss off those other people, persuading no one, or the mystery of what motivates people to sit through sermons. In his essay Here Be Sermons, Kevin Simler argues that sermons, unlike lectures, are not about transferring their content to the listener, but rather about communicating to the whole congregation that the rest of the congregation is also aware of the sermon’s content. If the sermon one week is about the importance of being honest, then for that week everyone in the congregation knows their peers have a keen eye for dishonesty. It improves the predictability of group members’ behavior.

Belief is a signal

If it’s the case that one of the purposes beliefs serve, prioritized above even correctness, is signaling to other members of a group that the believer is trustworthy and safe and worth cooperating with, we would expect

  1. the signal to cost something
  2. beliefs to change based on who we need to cooperate with

and both of these are the case.

The signal costs a lot

Religious traditions like circumcision, fasting, and sacrifice don’t make sense considered in isolation. Why would behaviors that jeopardize our reproduction and make us temporarily weak persist for hundreds of thousands of years? I believe it’s because these costs are the price of the signal. If you didn’t really believe, would you execute your own son? Probably not, so if someone is willing to do that, we can be pretty sure they fear whichever deity required it of them.

A group organized by beliefs is going to have a mix of sincere believers and sociopaths trying to fake it. Televangelists who fly private jets for uninterrupted communication with God or whatever, for example, are probably not sincere believers (just my opinion), and the cheaper a signal gets, the more sociopaths will jump in to abuse the trust of the true believers.

Uncommitted believers will want to lower the signal bar so that they can enjoy the group’s benefits without its costs, but doing so makes the group more vulnerable to opportunists. The costs must stay high to protect the critical mass of sincere believers.

The unreasonableness of a belief is the strength of the group that professes it.

Beliefs change when our social environment changes

As it relates to U.S. politics, there’s a popular explanation for why neighborhoods tend to be politically homogeneous: Americans self-sort into red and blue buckets. It’s not true.

I’ve always found the self-sorting explanation dubious, not least because how many Americans can afford to prioritize political alignment when they move, anyway? Greg Martin and Steven Webster investigated the sorting theory proposed in Bill Bishop’s book The Big Sort and didn’t think the numbers added up:

Next, we ran a large-scale simulation in which the rates of partisan sorting that we observed in our study were repeated over several election cycles. We found that Americans move frequently enough, and the partisan bias in moving is small enough, that if it were the only factor changing the geographic distribution of political preferences, then geographic polarization would actually go down substantially.

They conclude instead that the red and blue parties themselves won the allegiance of identity groups, such as classes and races, that were already geographically sorted, and further:

Our study found that people who move tend to adopt the partisan leanings of their new locale.

They determined this by examining voter registration changes after a move, which raises the question of how many less strong-headed minds, never registered in the first place, were changed too.
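The dynamic is easy to demonstrate. Here’s a toy simulation of my own devising (a sketch, not Martin and Webster’s actual model, with every parameter invented for illustration): residents move between locales each election cycle, with a small bias toward landing among co-partisans, and in one variant movers adopt their new locale’s majority party, which is the mechanism the study points to.

    import random
    import statistics

    # Toy model, not Martin & Webster's actual simulation. Residents are
    # 'R' or 'B'. Each cycle a fraction of residents moves; movers pick a
    # destination with a small bias toward locales where their party holds
    # a majority. Optionally, movers convert to the new locale's majority.

    NUM_LOCALES = 50
    RESIDENTS_PER_LOCALE = 200
    MOVE_RATE = 0.10      # fraction of residents who move each cycle
    SORTING_BIAS = 0.10   # chance a mover deliberately picks a friendly locale
    CYCLES = 40

    def make_world():
        # Start polarized: half the locales lean 80% Red, half 80% Blue.
        world = []
        for i in range(NUM_LOCALES):
            share = 0.8 if i < NUM_LOCALES // 2 else 0.2
            world.append(['R' if random.random() < share else 'B'
                          for _ in range(RESIDENTS_PER_LOCALE)])
        return world

    def red_share(locale):
        return sum(1 for p in locale if p == 'R') / len(locale)

    def polarization(world):
        # Spread of the Red share across locales; 0 means fully mixed.
        return statistics.pstdev(red_share(loc) for loc in world if loc)

    def step(world, convert_movers):
        movers = []
        for i, locale in enumerate(world):
            staying = []
            for person in locale:
                (movers if random.random() < MOVE_RATE else staying).append(person)
            world[i] = staying
        for person in movers:
            # Locales where the mover's party is already the majority.
            friendly = [i for i, loc in enumerate(world)
                        if loc and (red_share(loc) > 0.5) == (person == 'R')]
            if friendly and random.random() < SORTING_BIAS:
                dest = random.choice(friendly)
            else:
                dest = random.randrange(NUM_LOCALES)
            if convert_movers and world[dest]:
                # Adopt the majority party of the new locale.
                person = 'R' if red_share(world[dest]) > 0.5 else 'B'
            world[dest].append(person)

    def run(convert_movers):
        world = make_world()
        for _ in range(CYCLES):
            step(world, convert_movers)
        return polarization(world)

    random.seed(0)
    print(f"starting polarization:       {polarization(make_world()):.3f}")
    print(f"biased moves only:           {run(convert_movers=False):.3f}")
    print(f"movers adopt local majority: {run(convert_movers=True):.3f}")

If the logic holds, the moves-only run should end far less polarized than it began, because mostly-random moves mix neighborhoods faster than a small bias can sort them, while the conversion run should stay polarized or get worse.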

Maintaining a belief at odds with the people you need to cooperate with, at odds with your social world, incurs ongoing costs. It’s not a surprise that beliefs, motivated first by their social function, assume the shape of their social container.

Persuasion is about turning coats

Persuading people into or out of beliefs relating to their group membership requires changing their loyalties. For someone to examine a belief, their safety must not be contingent on holding it.

I’ve been using the etic vocabulary so far for the sake of clarity, but I want to be clear that the utility of my argument begins and ends with understanding ourselves. If we can’t see in ourselves how our reasoning is distorted by social needs, we won’t be able to understand it in others either. The worst-case scenario of ideas like this is that someone gloms onto them and starts pointing out other people’s inconsistencies to them. This is not smart, just tedious and rude. I am inconsistent. I adopt and defend beliefs I don’t understand because I perceive a social cost to dissent. You do too.

I did some very lazy anthropology and asked some of my friends who had changed their core beliefs—into or out of a religion, changing political parties, etc.—what motivated them to do so. There were two common themes in the timelines they provided, both of which preceded the switch:

  1. They made connections with people who held the new belief, connections in which they had mutual respect and felt safe.
  2. They felt alienated from the people or community who held the former belief.

This commentary has no bearing at all on the veracity of the adopted belief; it’s orthogonal. I myself see in retrospect that my transition out of religious belief was precipitated by a feeling of safety and inclusion among atheists. They invited me to sit with them at lunch in the intimidating high school cafeteria. They freely shared their bread bowls and the coveted mushy cookies that cost cash on top of the meal plan.

No counterargument to theism ever persuaded me. It was just that as I felt more and more safe in the outgroup, theist apologetics no longer needed to make sense, and then they ceased to do so. Arguments that once compelled me started to feel like academic exercises in specious reasoning.

I have no doubt the same mechanisms that impel arrival at a safe conclusion continue to assert themselves on my reasoning capacity. The role nature charges my brain to fulfill is not to discover truth, but to keep me sucking air, and that leaves a lot of axes of freedom for wrongness.

What can you do with this

We arrive at the peroration. If the idea I’ve explained here is predictive, it means most direct persuasive efforts are best simply dropped. It means the most effective method of persuasion is simply being kind and including someone who believes differently—I’ll address the blaring alarm bells in your head at this trite prescription momentarily—and in terms of direct conversation, at most a street-epistemology-type approach is tenable.

The problem with this prescription as it relates to American problems is that American problems are hard. Are you going to invite a loud racist to your dinner party? Does it compromise your integrity to hang out with a Tucker Carlson fan if some of your friends or family are immigrants? What if the opposing viewpoint implies a legitimate threat to your safety?

To these questions, all I have to say is that they should remain questions. The alternatives to diplomacy that I know of are segregation and domination.

Domination is, like, hard. And bloody. Physical segregation is logistically impossible and usually implemented by means of genocide (see: Andrew Jackson’s “We didn’t genocide them. We just moved them to Oklahoma. Totally different.”). Internet-based segregation, you can do that if you want. Kind of lonely though.

So my practical recommendation for the individual is to make peace with the fact that your power to influence the world and others is very small. If you, reader, happen to be in charge of millions of dollars to spend on psyops, I have different recommendations, and ask that you email me so I can influence you.


