Sunday, 8 December 2024

The Cult's Loyalty Snare


One of the strangest things about humans is the phenomenon whereby cults often grow stronger, not weaker, after their beliefs, predictions, assumptions, and leaders all turn out to be wrong. We can justifiably hope to live in a world in which the opposite is true, but the evidence shows we do not*. The primary reasons are three old favourites: cognitive dissonance, sunk costs, and in-group tribal identity.

Cognitive dissonance is well studied, and we know that people experience mental discomfort when confronted with evidence that contradicts their deeply held beliefs. In a cult, members invest a great deal of emotional, social, and sometimes financial energy in their beliefs, so when a cult's prophecies fail, its members experience acute cognitive dissonance. People also feel more committed to beliefs and decisions the more they invest in them – whether time, money, relationships, or personal identity – and after sinking those costs into the group, rejecting it and admitting they were wrong can feel like forfeiting a heavy investment. To avoid that sense of loss, members may double down on their beliefs, and even intensify their involvement in the group, reinforcing their commitment and loyalty.

Cults often provide a powerful sense of community and identity, which can be especially important for people who felt isolated before joining. When faced with a crisis of belief, members may cling to their group even more strongly to preserve their sense of belonging and identity. Leaving the group could mean losing this social support system and experiencing isolation, which many members would rather avoid.

You have to remember, too, that whichever cult you're in – Jehovah's Witnesses, Scientology, Just Stop Oil, Answers in Genesis (to name but four) – the golden rule of cult deception is that you don't know you're in a cult. Cults often operate by encouraging members to distance themselves from non-believers, which limits access to outside opinions and evidence that might otherwise challenge their beliefs. This isolation makes it easier to control information and, in turn, to reinterpret or ignore contradictory evidence. Leaders can reinforce the idea that doubting or leaving the group would mean betrayal, disloyalty, or even punishment, which keeps members aligned even when beliefs are under strain.

And don't forget that, at this stage, cult members are riddled with confirmation bias: they are so heavily primed to seek out or interpret information in ways that confirm their existing beliefs that they habitually read events as upholding the underlying narrative, even when the outside world has consistently shown it to be wrong. I've seen documentaries in which failures and refutations are enthusiastically reinterpreted as steps toward a grander, impending moment, which heightens members' commitment to prepare or sacrifice even more.

The paradoxical nature of belief reinforcement through disconfirmation – driven by this toxic combination of cognitive dissonance, the sunk cost fallacy, social isolation, and commitment to a group identity – must qualify as one of the strangest of all human phenomena.

And, alas, one of the most stultifying things about being in a cult is that cults manipulate from the top down in a way that starves members of the freedom to fulfil their potential. Membership of a cult is predicated on the understanding that its members are not encouraged to be the best they can be in life; instead they are merely encouraged to serve the cult's agenda – a dynamic that can usually be sustained only through repression and indoctrination.

*See the studies of Festinger, Riecken, and Schachter (notably When Prophecy Fails), and the work of Stark, Weber, and Landes, for further reading (Asch's infamous conformity experiments are also interesting in this context).

