Written by Steven Van Neste
Yes, the title is in jest, but only partially so, for though each and every one of us could be dragged into a web of undue influence, the more aware we are of certain things, the more resilient we may become to cultic snares. As is—rightfully so—stated by almost everyone researching cults, a cult is not something one joins knowingly; yet at the same time it is not quite true that the individual is wholly innocent, for after all, we do join something. The fish is reeled in because he has a hook in his mouth. The fish has a hook in his mouth because he took the bait. The fish gets caught because of his hunger, and likewise, one may become trapped in a cult or fall victim to undue influence because of needs for which we seek satisfaction. All of us have needs, and hence we are all susceptible to being taken in by a group or individual seeking to cozen us. The fish gets caught because the food with which he wished to satiate himself had a hook in it; likewise, in our human society what we must be on the lookout for is hooks. We all seek happiness; we all wish to belong. Even the loner is not free from undue influence; on the contrary, extreme loneliness may very well make one more likely to fall prey to indoctrination. Timothy McVeigh—who committed the 1995 Oklahoma City bombing—was a loner who is said to have become indoctrinated through extreme-right cultic literature (Galanter, 1989). At the time of the Oklahoma City bombing, the internet was young, whereas now it is something most of us carry with us most of the time. The often-cold anonymity of Cyberia is fertile ground for all sorts of propaganda and indoctrination, and is rife with conspiracy theories and so-called alternative facts. Perhaps more important is that when we become too estranged from our fellow human beings, we stand to lose valuable interpersonal input, which may ultimately lead to a weakening of self-integrity.
The possibility of cultic entanglement exists in all of us, as was remarked by the late Benjamin Zablocki: everyone can be brainwashed; it merely requires the right circumstances. It must be noted here that Zablocki’s use of the term ‘brainwashing’ differs from what most have in mind, especially since the layperson tends to think of brainwashing as something that traps a person in a cult, whereas this is not so: brainwashing is what keeps members inside, and it only occurs to those who are already trapped in a cult (Zablocki, 2001 & 2016). It is, however, not just that each person can be brainwashed, but also that everyone can be pulled into a situation where such a thing could become a reality. Hannah Arendt’s famous term “the banality of evil” is not just about extremes but can also be applied to seemingly more innocent movements. When Arendt coined this term in 1963, she received a lot of backlash from people who did not want to accept that a Nazi war criminal such as Adolf Eichmann could be considered a normal person who was merely driven to his actions through an intricate network of bureaucracy and obedience. One person who strongly agreed with Arendt, however, was Stanley Milgram, whose landmark experiments showed that not only does everyone indeed seem capable of blindly following authority, but some of us go very far in said obedience. The sole reason we imagine it cannot happen to us is that we are not in that situation; sitting back in our chairs, in our lives of ordinary society and morality, we cannot understand what it would be like. Most of us like to think of ourselves as good, and we would not deliberately want to hurt another person, yet this is precisely what happened in Milgram’s experiments.
Even more so, when the individuals were debriefed afterwards, they often stated that they did not want to do it and did not feel comfortable with it, but they kept going because it was what was expected, because an authority said it was okay, and because they felt the responsibility lay wholly with that authority (Milgram, 1974). Similar findings were made by Philip Zimbardo through his notorious Stanford Prison Experiment, where seemingly normal college students not only became cruel and depraved prison guards but did so at an alarmingly fast pace: the experiment was set to run for two weeks but was abandoned after merely six days, having run completely out of control.
It is easy to judge from within our ordinary environment, and this is perhaps the greatest danger in possibly becoming a victim of undue influence, as man generally tends to overestimate his personal mental strength and security. The other fact we are negligent about is that different situations can drastically alter brain states; hence when we judge, we do so while falsely imagining our mind would remain the same as it is now. More so, going back to what was mentioned already: nobody joins a cult, yet something is joined. What creates this dichotomy is that all of this happens largely unconsciously, so we must be aware that when we think it could not happen to us, that thinking comes from the wrong assumption that joining a cult is a conscious decision. Undue influence operates almost entirely via unconscious processes; hence if we wish to have somewhat of a safeguard against it, we ought to stop thinking about it from a conscious perspective. Central to understanding this is the work of Daniel Kahneman (2011) and his division of thinking processes into fast and slow. When a person is lured into a cult, this happens through fast processes, as these are largely unconscious and involve less activity in the prefrontal cortex.
The most common mistake made—especially by the general public—is the belief that people who fall prey to cults do so because of underlying mental issues or weaknesses; this, however, is simply not the case. One falls victim to a cult not because of a broken mind but because of a situation. As I have written before concerning the Peoples Temple (Van Neste, 2016), people were lured in through a sense of goodwill; they were not people radically different from you or me, but merely people who, through the particularity of their individual situation, became vulnerable to being recruited into a cult. It is not that fast thinking processes are themselves bad; on the contrary, cognitive resources are limited, which is why intense slow thinking leads to cognitive strain. But we must never rely on them exclusively, and must remain flexible enough to cogitate using slow processes when the need arises. Another common misconception may arise here, namely that one is the kind of individual who would never find oneself in such a situation, but this again is fallacious, as it simply is impossible to foretell the situational, let alone what our brain-mind state would be. One may think of oneself as a smart and conscientious grocery shopper, but then one day the occasion arises where one visits a brand-new grocery store while feeling rather peckish. Similarly, a person who leads a genuinely happy and fulfilled life may find himself in a state of depression following a bereavement and fall prey to undue influence operating on the promise of recovery and happiness. A cult also does not need to be extreme or operate on some radical principle; anything has the potential for undue influence. Even seemingly innocent groups offering drug and alcohol rehabilitation often hide a darker underside, not necessarily in the sense of harbouring a plan for something malicious, but in that they indoctrinate certain principles unrelated to what the victim signed up for.
Alcoholics Anonymous is a good example of this (see Galanter, 1989 & Peele, 2014), since for all the supposed good they seek to do, they also operate under cultic principles. A person who joins Alcoholics Anonymous may initially find the idea of a higher power ridiculous yet soon becomes an ardent follower; even more so, the deeper he gets into his sobriety, the less he questions the validity of the steps and operating principles.
There is a lot of wisdom in the common advice to always be aware of one’s surroundings; to this, however, we may add that one should also be aware of one’s own cognitive and emotional states. Fast-thinking processes often depend on emotion and bias; as such, the irony is that the same quickness whereby we feel certain we could never fall prey to undue influence is the same quickness whereby we might fall prey. We may feel strongly about our conscious selves, but most of our cognitive processes happen on an unconscious level, to which we may add that we tend to overestimate our own conscious performance and self-integrity (see for instance Wilson, 2002). How often do we really stop to analyze our conscious performance? Where does a belief come from, or what causes a certain behaviour? Through experiments it has long been known that an action ascribed to our conscious mind was often in fact initiated before consciousness of it was possible, so that often our consciousness lies not in a decision made but rather in our observation of it. Likewise, we also tend to be too certain of our emotional state, especially in the sense that we often lack emotional honesty. The tendency to keep face in public settings often moves into our private lives as well: because we like to be okay, we often pretend we are so, even to our own selves. Many a stereotype of a mood disorder is just that: a stereotype. In reality, our moods often are hidden even from our own selves, and it is this lack of conscious honesty which makes us all the more vulnerable to situations that may act upon our mood. As was said by Joost Meerloo (1949): “How much truth can a man bear? How much truth about himself can he bear? Does he have the courage to penetrate ever more deeply into reality?”
If we wish to secure ourselves from cults and undue influence, then we must first dare to be honest about our own selves and vulnerabilities. The situational always depends upon emotion rather than intelligence; we admonish our kids for falling into situations of peer pressure, for failing to think for themselves, yet as adults we are not much better off. Aside from some of the more eccentric individuals perpetrating the 2021 storming of the US Capitol, many of them would have been considered more or less well-adjusted individuals; but, as with other terrorists, they perhaps had trouble coping with an ever-expanding world and, in order to boost their ego—and find a new sense of self-worth and authenticity—found a cause to which they could “religiously” devote themselves (see also: Stern, 2003). People who find themselves marginalized often have a greater chance of falling prey to undue influence, yet it is important to note that this does not necessarily mean people who feel poorly about themselves; it may also very well mean people who are under a spell of excessive self-importance. Similarly, those with an excessive tendency to think outside the box may also be at higher risk, as they are more prone to confabulation and to surrendering to alternative figures of authority. The key takeaway here is that one does not fall prey to a cult because one lacks thinking, but rather because of improper thinking, which when combined with certain emotional states and situational elements may form a recipe for disaster.
That everyone may fall prey to undue influence does not mean we cannot try to safeguard ourselves as much as possible. The two main contributors to cultic involvement are our inherent need to belong and be accepted, and interpersonal friction. A cult may arise through epochal specificity—as a reaction to socioeconomic movement whereby the individual feels out of touch with the world in which he was brought up—or through a general sense of malaise and ennui. When we have trouble accepting and functioning in an ever-changing and evolving reality, our sense of self may become distorted. It is important that we learn to keep ourselves in check, an idea perhaps best expressed by the ancient Greek virtue of sophrosyne, which translates to self-moderation and the etymology of which means to be sound of mind, or even to be safe inside one’s own mind; this idea was also present at the birth of modern philosophy, as one of the maxims from which René Descartes operated was to keep one’s opinions as moderate as possible (Discourse on Method). This does not mean, however, that one is not allowed to think differently, but rather that we should not let differences become agents of excessive reactionism. Self-moderation may furthermore be a great way of tempering one’s need for self-assertion, which is especially needed in a society driven more and more by utility and material fulfilment. As found in the work of Mordechai Rotenberg (2015), the more we set up the other as competition, the harder it will be to find meaningful interpersonal relationships; Rotenberg’s answer—rooted in his blending of psychology with aspects more particular to Jewish thought and mysticism—lies in a metaphoric contracting of one’s own self in order to leave room for otherness.
In conclusion, we may never know in advance when we might encounter a situation that leaves us vulnerable to undue influence, but we may minimize such risk by relying less on excessive self-certainty and focusing more on self-awareness, through mindfulness and introspection, by questioning our thoughts, our beliefs and, most importantly, our emotions. In a world that is ever-evolving, the way to cope is not by moving back into some nostalgic fantasyland, but by embracing the wide-open future through resilience, and by allowing ourselves to move into what Robert Lifton (1993) calls “the Protean Self”—as he more recently reiterated: “the protean self is a way of adapting to significant historical dislocation and change. It is at the same time an expression of the basic human need for symbolization” (2019). The essence for Lifton is that cultism derives from man’s inability to cope with societal changes, and hence man seeks comfort in a narrowing of symbolism; Proteanism, on the other hand, involves a widening of symbolism, leading towards a more multi-faceted self that feels more at ease in a society of multi-cultural growth. That the Protean Self is more adaptable means it can come to stand in greater authenticity, since it no longer needs to aggressively assert itself against what the narrow self fears; and the more we move away from such fear, the more we may find a certain level-headed tranquillity and soundness of mind.
Galanter, Marc (1989) – Cults: Faith, Healing, and Coercion
Kahneman, Daniel (2011) – Thinking, Fast and Slow
Lifton, Robert (1993) – The Protean Self
Lifton, Robert (2019) – Losing Reality
Meerloo, Joost (1949) – Delusion and Mass Delusion
Milgram, Stanley (1974) – Obedience to Authority
Peele, Stanton (2014) – Recover!
Rotenberg, Mordechai (2015) – The Psychology of Tzimtzum
Stern, Jessica (2003) – Terror in the Name of God
Van Neste, Steven (2016) – Jonestown & Humanity (in Alternative Considerations of Jonestown & Peoples Temple, web resource: https://jonestown.sdsu.edu/?page_id=67414)
Wilson, Timothy (2002) – Strangers to Ourselves
Zablocki, Benjamin (2001) – Towards a Demystified and Disinterested Scientific Theory of Brainwashing (in Misunderstanding Cults, edited by Benjamin Zablocki and Thomas Robbins)
Zablocki, Benjamin (2016) – An Interview with Benjamin Zablocki (in Ethics and the Modern Guru: Peoples Temple Edition)