We assume our beliefs are grounded in truth. But with over 90% of people holding beliefs that could be classed as delusions, it may be time to reframe that assumption.
Beliefs form a personal guidebook to reality, telling us not just what is factually correct but also what is right and good, and hence how to behave towards one another and the natural world.
“Everyone knows what belief is until you ask them to define it,” says Halligan. What is generally agreed is that belief is a bit like knowledge, but more personal. Knowing something is true is different from believing it to be true; knowledge is objective, but belief is subjective. It is this leap-of-faith aspect that gives belief its singular, and troublesome, character.
(For me, Robin, a belief is an internally held lens through which we perceive reality.)
Descartes thought understanding must come first; only once you have understood something can you weigh it up and decide whether to believe it or not. Spinoza didn’t agree. He claimed that to know something is to automatically believe it; only once you have believed something can you un-believe it. The difference may seem trivial but it has major implications for how belief works.
If you were designing a belief-acquisition system from scratch, it would probably look like the Cartesian one. Spinoza’s view, on the other hand, seems implausible. If the default state of the human brain is to unthinkingly accept what we learn as true, then our common-sense understanding of beliefs as something we reason our way to goes out of the window. Yet, strangely, the evidence seems to support Spinoza. For example, young children are extremely credulous, suggesting that the ability to doubt and reject requires more mental resources than acceptance. Similarly, fatigued or distracted people are more susceptible to persuasion. And when neuroscientists joined the party, their findings added weight to Spinoza’s view.
According to Langdon and others, the normal process of belief formation involves unconscious reflection on incoming information until a “feeling of rightness” arrives, and a belief is formed.
So where does the feeling of rightness come from? The evidence suggests that it has three main sources – our evolved psychology, personal biological differences and the society we keep.
The importance of evolved psychology is illuminated by perhaps the most important belief system of all: religion. Although the specifics vary widely, religious belief per se is remarkably similar across the board. Most religions feature a familiar cast of characters: supernatural agents, life after death, moral directives and answers to existential questions. Why do so many people believe such things so effortlessly?
According to the cognitive by-product theory of religion, their intuitive rightness springs from basic features of human cognition that evolved for other reasons. In particular, we tend to assume that agents cause events. A rustle in the undergrowth could be a predator or it could just be the wind, but it pays to err on the side of caution; our ancestors who assumed agency would have survived longer and had more offspring. Likewise, our psychology has evolved to seek out patterns because this was a useful survival strategy. During the dry season, for example, animals are likely to congregate by a water hole, so that’s where you should go hunting. Again, it pays for this system to be overactive.
This potent combination of hypersensitive “agenticity” and “patternicity” has produced a human brain that is primed to see agency and purpose everywhere. And agency and purpose are two of religion’s most important features – particularly the idea of an omnipotent but invisible agent that makes things happen and gives meaning to otherwise random events. In this way, humans are naturally receptive to religious claims, and when we first encounter them – typically as children – we unquestioningly accept them. There is a “feeling of rightness” about them that originates deep in our cognitive architecture.
According to Kreuger, all beliefs are acquired in a similar way. “Beliefs are on a spectrum but they all have the same quality. A belief is a belief.”
Our judgement of which ideas feel right to believe in is fallible (Image: Gordon Scammell/Loop Images/Corbis)
Our agent-seeking and pattern-seeking brain usually serves us well, but it also makes us susceptible to a wide range of weird and irrational beliefs, from the paranormal and supernatural to conspiracy theories, superstitions, extremism and magical thinking. And our evolved psychology underpins other beliefs too, including dualism – viewing the mind and body as separate entities – and a natural tendency to believe that the group we belong to is superior to others.
A second source of rightness is more personal. When it comes to something like political belief, the assumption has been that we reason our way to a particular stance. But, over the past decade or so, it has become clear that political belief is rooted in our basic biology. Conservatives, for example, generally react more fearfully than liberals to threatening images, scoring higher on measures of arousal such as skin conductance and eye-blink rate. This suggests they perceive the world as a more dangerous place and perhaps goes some way to explaining their stance on issues like law and order and national security.
Another biological reflex that has been implicated in political belief is disgust. As a general rule, conservatives are more easily disgusted by stimuli like fart smells and rubbish. And disgust tends to make people of all political persuasions more averse to morally suspect behaviour, though the response is stronger in conservatives. This has been proposed as an explanation for differences of opinion over important issues such as gay marriage and illegal immigration. Conservatives often feel strong revulsion at these violations of the status quo and so judge them to be morally unacceptable. Liberals are less easily disgusted and less likely to judge them so harshly.
Different realities
These instinctive responses are so influential that people with different political beliefs literally come to inhabit different realities. Many studies have found that people’s beliefs about controversial issues align with their moral positions on them. Supporters of capital punishment, for example, often claim that it deters crime and rarely leads to the execution of innocent people; opponents say the opposite.
That might simply be because we reason our way to our moral positions, weighing up the facts at our disposal before reaching a conclusion. But there is a large and growing body of evidence to suggest that belief works the other way. First we stake out our moral positions, and then mould the facts to fit.
So if our moral positions guide our factual beliefs, where do morals come from? The short answer: not from conscious reasoning.
According to Jonathan Haidt at the University of Virginia, our moral judgements are usually rapid and intuitive; people jump to conclusions and only later come up with reasons to justify their decision. To see this in action, try confronting someone with a situation that is offensive but harmless, such as using their national flag to clean a toilet. Most will insist this is wrong but fail to come up with a rationale, and fall back on statements like “I can’t explain it, I just know it’s wrong”.
This becomes clear when you ask people questions that include both a moral and factual element, such as: “Is forceful interrogation of terrorist suspects morally wrong, even when it produces useful information?” or “Is distributing condoms as part of a sex-education programme morally wrong, even when it reduces rates of teenage pregnancy and STDs?” People who answer “yes” to such questions are also likely to dispute the facts, or produce their own alternative facts to support their belief. Opponents of condom distribution, for example, often state that condoms don’t work so distributing them won’t do any good anyway.
What feels right to believe is also powerfully shaped by the culture we grow up in. Many of our fundamental beliefs are formed during childhood. According to Kreuger, the process begins as soon as we are born, based initially on sensory perception – that objects fall downwards, for example – and later expands to more abstract ideas and propositions. Not surprisingly, the outcome depends on the beliefs you encounter. “We are social beings. Beliefs are learned from the people you are closest to,” says Kreuger. It couldn’t be any other way. If we all had to construct a belief system from scratch based on direct experience, we wouldn’t get very far.
This isn’t simply about proximity; it is also about belonging. Our social nature means that we adopt beliefs as badges of cultural identity. This is often seen with hot-potato issues, where belonging to the right tribe can be more important than being on the right side of the evidence. Acceptance of climate change, for example, has become a shibboleth in the US – conservatives on one side, liberals on the other. Evolution and vaccination are similarly divisive issues.
So, what we come to believe is shaped to a large extent by our culture, biology and psychology. By the time we reach adulthood, we tend to have a relatively coherent and resilient set of beliefs that stay with us for the rest of our lives (see “Your five core beliefs”). These form an interconnected belief system with a relatively high level of internal consistency. But the idea that this is the product of rational, conscious choices is highly debatable. “If I’m totally honest I didn’t really choose my beliefs: I discover I have them,” says Halligan. “I sometimes reflect upon them, but I struggle to look back and say, what was the genesis of this belief?”
Forget the facts
The upshot of all this is that our personal guidebook of beliefs is both built on sand and highly resistant to change. “If you hear a new thing, you try to fit it in with your current beliefs,” says Halligan. That often means going to great lengths to reject something that contradicts your position, or seeking out further information to confirm what you already believe.
That’s not to say that people’s beliefs cannot change. Presented with enough contradictory information, we can and do change our minds. Many atheists, for example, reason their way to irreligion. Often, though, rationality doesn’t even triumph here. Instead, we are more likely to change our beliefs in response to a compelling moral argument – and when we do, we reshape the facts to fit with our new belief. More often than not, though, we simply cling to our beliefs.
All told, the uncomfortable conclusion is that some if not all of our fundamental beliefs about the world are based not on facts and reason – or even misinformation – but on gut feelings that arise from our evolved psychology, basic biology and culture. The results of this are plain to see: political deadlock, religious strife, evidence-free policy-making and a bottomless pit of mumbo jumbo. Even worse, the deep roots of our troubles are largely invisible to us. “If you hold a belief, by definition you hold it to be true,” says Halligan. “Can you step outside your beliefs? I’m not sure you’d be capable.”
The world would be a boring place if we all believed the same things. But it would surely be a better one if we all stopped believing in our beliefs quite so strongly.