Monday, 25 September 2023

Church Set-Up & Varying Human Beliefs


When on the church set-up team, one of our jobs is to put down the mats for the kids to play on. As you can see from the picture I took above, there are four colours, and for aesthetic effect we endeavour to place the mats so that no two adjacent mats are the same colour. There is actually a mathematical precedent for this - it's called the four colour theorem, and it states that no more than four colours are required to colour the regions of any planar map so that no two adjacent regions share a colour.
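The mat layout can be phrased as a graph colouring problem: treat each mat as a vertex and each pair of touching mats as an edge. Here's a minimal Python sketch (the mat labels and colour names are hypothetical) that assigns colours greedily. The theorem guarantees a four-colouring exists for any planar layout, but note that this simple greedy pass is only a heuristic and may need a different visiting order on awkward inputs.

```python
# Minimal sketch: greedy colouring of a mat-adjacency graph.
# The four colour theorem guarantees a 4-colouring exists for any
# planar layout; this greedy pass is just a heuristic illustration.

COLOURS = ["red", "blue", "green", "yellow"]  # hypothetical mat colours

def greedy_four_colour(adjacency):
    """adjacency: dict mapping each mat to the set of mats touching it."""
    assignment = {}
    for mat in adjacency:
        # Colours already taken by this mat's coloured neighbours.
        used = {assignment[n] for n in adjacency[mat] if n in assignment}
        for colour in COLOURS:
            if colour not in used:
                assignment[mat] = colour
                break
        else:
            raise ValueError(f"greedy pass got stuck at {mat}; try another order")
    return assignment

# A 2x3 grid of mats, each adjacent to its horizontal/vertical neighbours.
mats = {
    "A": {"B", "D"}, "B": {"A", "C", "E"}, "C": {"B", "F"},
    "D": {"A", "E"}, "E": {"B", "D", "F"}, "F": {"C", "E"},
}
print(greedy_four_colour(mats))
```

For a simple grid like this the greedy pass ends up using only two colours (a checkerboard pattern); the full strength of four colours is only needed for more tangled planar layouts.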

The four colour theorem has been proven, although the proof is famously long-winded - it was the first major theorem to be proved with computer assistance. This is in contrast to something like the Twin Primes Conjecture (there are infinitely many primes p for which p + 2 is also prime), which has not yet been proven, even though the intuition for why it should be true is easy to grasp. Then there's Gödel's incompleteness theorem, which shows that any consistent, formally mechanised system of axioms rich enough to express arithmetic contains a true statement - its Gödel sentence, G(F) - that the system itself cannot prove; no such set of axioms captures all of mathematics without leaving a residue of incompleteness.
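To make the contrast concrete, here's a minimal Python sketch that enumerates twin prime pairs below a bound. Gathering numerical evidence for the conjecture is trivial, as the code shows; turning that evidence into a proof is the part no one has managed.

```python
# Minimal sketch: enumerate twin prime pairs (p, p + 2) below a bound.
# Finding examples is easy; proving there are infinitely many is not.

def is_prime(n):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:  # trial division up to sqrt(n)
        if n % d == 0:
            return False
        d += 2
    return True

def twin_primes(limit):
    return [(p, p + 2) for p in range(2, limit - 1)
            if is_prime(p) and is_prime(p + 2)]

print(twin_primes(100))
# [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31), (41, 43), (59, 61), (71, 73)]
```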

These varying mathematical propositions gave me pause to think about commentary I frequently make on human behaviour, on smart and dumb thinking, and on how certain beliefs are embraced wantonly, even when they are obvious nonsense. This ties in with a reliable rule of thumb (sometimes called Brandolini's law): the amount of effort needed to refute incorrect thinking is an order of magnitude bigger than the effort needed to produce it. There is a complex relationship between the beliefs themselves, the reasons people believe them, and the motivations they have for not questioning them very rigorously (or often not at all).

I think there are three important things to draw attention to about the formation of beliefs. The first is that most of your beliefs are likely to be partially wrong or faulty - the chance that you've thought everything through with precise rigour and without cognitive error is infinitesimally low. The second is that the number of possible wrong interpretations of reality we can produce is immeasurably large, whereas the number of possible right interpretations is vanishingly small in comparison. And the third is that you have very strong emotional, psychological, cultural and tribal incentives bound up with your beliefs - and the chances that you've managed to address them comprehensively and honestly, without any biases distorting the picture, are practically zero.

Consequently, the formation of rational and correct beliefs and viewpoints is a big ask; it involves precise, careful and attentive reasoning and truth-seeking; it involves understanding how complex the search space is; it involves the ability to sift through a lot of data and select the right beliefs from an astronomically larger number of incorrect ones; and it involves the willingness to discern and confront our own intellectual biases and emotional prejudices and filter them out as best we can, leaving us in support of the bare truth and plain facts. That is a gigantic ask.

And, assuming a willingness to seek the truth honestly, we find that, as with the mathematical examples above, in some cases it is hard to sense precisely why something is wrong but not that hard to determine the steps needed to arrive at the corrective, and in other cases it's easy to sense that something is wrong but hard to determine those steps. This is because human error is riddled with complex fault narratives: intellectual ineptitude, scarcity of knowledge, emotional biases, tribal pressures, paths of least resistance, indolence, weak support networks, and other competing trade-offs that compromise the success of the task.

This means the mind is fighting to reconcile a number of foggy inner conflicts: sensing a belief is wrong but lacking the incentive to find out why; finding it convenient to hold a dubious viewpoint because it lessens burdens elsewhere; making assumptions by outsourcing your thinking to people whose views subliminally serve your own interests; compromising the truth in an attempt to gain status within an assortative mating pool shaped by sexual selection; assenting to a black-and-white mode of thinking because it helps you manage negative emotions - the list goes on.

Given all the above, here is some guidance you may find useful. If you hold a view and you suspect deep down that it may be wrong, ask yourself a candid question: do I really care about being right on this matter? In other words, am I prepared to pay the costs of trying to understand why I'm wrong and what the correct alternative is, and the costs of changing my mind (there may be social, familial, cultural and professional costs to a public change of mind)? The vast majority of people who are wrong on a particular issue don't really care about being right. If they were prepared to pay the costs of being right, they usually would have done so by now.

If the answer to the question "Do I really care about being right on this matter?" is yes - and you'd better pray it is - try analysing why you believe this wrong view: think about how you arrived at this belief; think about the incentives of those who also claim to believe it; think about what experts and others who disagree with you believe about this matter; think about how well you understand the counter-arguments to your belief; think about whether this belief makes you feel good about yourself or uncomfortable; think about what would convince you your view is wrong; and think about what your best arguments for the opposing side would be if you had to argue their case in a courtroom-type scenario.

If you care about being right on a particular matter, and you subject yourself to the above questions and are prepared to go where the indications lead, there's a decent chance you will arrive at the right viewpoint. Two words of warning, though. First, if you sense you're not willing to subject all your views to this level of scrutiny, you are probably not taking things seriously enough. Second, if you hold a wrong view and you have no inkling that it's wrong, you need to locate the means by which you might possibly recognise the view as wrong. This can be difficult, because the fog that clouds vision can often feel like the vision itself if you don't sense the clear horizon in the distance. Being wrong but not knowing it doesn't feel much different from being right; being closed-minded doesn't always feel like closed-mindedness; and cognitive errors don't always feel like defects, especially when they are enslaved by passion for a cause.
