
While beliefs are a necessary element of healthy cognition, rigid beliefs are at the root of some of the most damaging problems humans face, both at the individual and at the social level. Like dictators, once an idea enters the control room of our minds, it is very difficult to force its abdication, as Francis Bacon observed.
The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side (…) may remain inviolate. (…) [T]he first conclusion colors and brings into conformity with itself all that come after, though far sounder and better.
Francis Bacon, New Organon, or True Directions Concerning the Interpretation of Nature (English translation based on the 1863 translation by James Spedding, Robert Leslie Ellis, and Douglas Denon Heath)
Not all beliefs are equally resilient, and their resilience can change over time. Even notorious conspiracy beliefs can be destabilized by exposure to counterevidence, especially when that evidence comes from a source perceived as trustworthy. Yet evidence often has surprisingly little impact.
Some of us (how many? Who knows, but certainly the author of this blog) think (believe) that people should be free to believe what they want, but also that our ideas (images, metaphors, beliefs) fundamentally constrain and define our lives.
In a paper(1) published in PNAS last week, Marten Scheffer et al. recognize that beliefs are shaped and maintained by an ongoing interplay of dynamical processes and argue that one obvious way to understand beliefs is to see them as “attractors” in the sense of dynamical systems theory.
We assume that belief is a saturating function of the perceived evidence-for, and that disbelief (negative belief) saturates with perceived evidence-against. The result is a sigmoidal curve representing belief strength as a function of perceived cumulative evidence (A).
Given a sufficiently strong confirmation bias, an objectively neutral package of evidence may be turned into perceived evidence for or against, depending on the existing belief (B).
Even if objective evidence changes, this will usually have only minor effects on the perceived evidence and, therefore, on the belief. However, if the cumulative objective evidence changes strongly enough for the unstable equilibrium to touch the belief equilibrium, a tipping point (T2) is reached where the stability of the belief is destroyed and the belief is abandoned. Analogously, at T1 the disbelief becomes unstable (D).
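
To make the mechanism concrete, here is a minimal numerical sketch of a toy belief model in Python. It is my own illustration, not the equations from the paper: belief relaxes toward a tanh (sigmoid) of the perceived evidence, the perceived evidence is the objective evidence tilted by the current belief (a confirmation-bias term of assumed strength c), and slowly sweeping the objective evidence up and then back down exposes the two tipping points T2 and T1 as hysteresis. All names and parameter values are illustrative assumptions.

```python
import numpy as np

def belief_step(B, E_obj, k=3.0, c=1.0, dt=0.1):
    """One relaxation step of a toy belief model (illustrative, not the paper's equations).

    B     : current belief in [-1, 1] (negative values = disbelief)
    E_obj : cumulative objective evidence (negative = evidence against)
    k     : steepness of the saturating (sigmoidal) belief response
    c     : confirmation-bias strength -- the existing belief tilts the perceived evidence
    """
    E_perceived = E_obj + c * B           # belief colours the evidence (panel B)
    B_target = np.tanh(k * E_perceived)   # sigmoidal belief curve (panel A)
    return B + dt * (B_target - B)        # relax toward the equilibrium belief

def settle(B, E_obj, steps=2000):
    """Let the belief settle onto an attractor while the evidence is held fixed."""
    for _ in range(steps):
        B = belief_step(B, E_obj)
    return B

E_values = np.linspace(-2.0, 2.0, 81)

# Sweep the objective evidence slowly upward, starting from firm disbelief.
B, T2 = -1.0, None
for E in E_values:
    B = settle(B, E)
    if T2 is None and B > 0:
        T2 = E                            # first flip to belief: tipping point T2
print(f"belief flips to 'for' near E_obj ~ {T2:.2f} (T2)")

# Sweep back down, starting from firm belief: the flip happens elsewhere (hysteresis).
B, T1 = 1.0, None
for E in E_values[::-1]:
    B = settle(B, E)
    if T1 is None and B < 0:
        T1 = E                            # flip back to disbelief: tipping point T1
print(f"belief flips back to 'against' near E_obj ~ {T1:.2f} (T1)")
```

With these (assumed) parameters the belief only flips after the evidence has moved well past neutral, and it flips back at a different point on the way down; that asymmetric inertia is the “belief trap” the paper describes.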
In short, two elements are central when it comes to the question of what we can do about “harmful” beliefs:
- Reducing the resilience of specific harmful beliefs requires sustained exposure to counterevidence, which typically requires organized rational override.
- Reducing rigidity of beliefs in general can be achieved through improving education and reducing social stress.
Just keep your ideas (beliefs) always in check… there (might) be dragons inside your head.
____________________
(1) Scheffer, Marten, Denny Borsboom, Sander Nieuwenhuis, and Frances Westley. ‘Belief Traps: Tackling the Inertia of Harmful Beliefs’. Proceedings of the National Academy of Sciences 119, no. 32 (9 August 2022): e2203149119. https://doi.org/10.1073/pnas.2203149119.
Featured Image: Sergio Albiac, Laws of Attractor