Jan 15, 2016

Cult Attraction is Not a Problem of Logic

Alexandra Stein
Fair Observer
July 21, 2015

Alexandra Stein is an Associate Lecturer of Social Psychology at Birkbeck, University of London. She is a writer and educator specializing in the study of cults and ideological extremism. She is an ex-member of a political cult and has documented that experience in her book, "Inside Out." Stein offers prevention education programs and materials to help people understand how to identify and protect themselves from recruitment to cultic or extremist groups. More information is available at www.alexandrastein.com

What are cults, and how do they work?

Cults come in a great variety of forms: from the largely religious ones to terrorist groups that train suicide bombers; from right-wing to the ingrown “left” political groups that thrived in the 1970s and 1980s; and from get-rich-quick to personal growth groups. Although they are not all violent, they do share common features that enable them to exert extraordinary levels of control over their members.

The mechanisms that drive these groups are not as mysterious as we may think. Seventy years of scholarship has been devoted to understanding them.

Beginning with the horrors of World War II and continuing through Joseph Stalin’s and Mao Zedong’s totalitarian regimes, scholars did groundbreaking work to try to understand the forces that produced extreme obedience to charismatic leaders. This period saw, among others, Hannah Arendt’s great work, The Origins of Totalitarianism; Stanley Milgram’s extraordinary experiments in which ordinary people administered seemingly excruciating electrical shocks to strangers; and Robert Jay Lifton’s insightful work on brainwashing, crystallized in his Eight Criteria for Thought Reform.

Most recently, a new generation of scholars, myself among them, has emerged who have themselves been victims of this process.

Although some scholars dismiss the concepts behind the terms “cult” and “brainwashing,” these organizations and their processes of extreme control have not abated.


Cultic or ideologically extremist groups are controlled by a leader or leadership group that is both charismatic and authoritarian. These leaders are psychopaths. Both charisma and authoritarianism are required as they are the source of the group’s central organizing dynamic of “love” and fear. Charisma alone is not sufficient.

Nelson Mandela was charismatic but not authoritarian. Jim Jones was both. The dual nature of the leader’s personality—charisma and authoritarianism—is the fundamental dynamic in the group. The leader needs charisma to appeal to followers, but at the same time, the leader’s authoritarian nature leads to actions that generate a feeling of fear, terror or threat. This is a potent mix that leads to control of followers. 


The inner structure of a cult is closed, isolating and steeply hierarchical. At the top sits the leader, whose every whim must be obeyed. Followers must renounce ties to outsiders—unless they can be recruited or used in some way. Yet within the group itself, belying the stereotype of close “community” that exists within cults, followers are, in important ways, isolated from each other, allowed to communicate only within the narrow confines of the group’s belief system.

The structure both isolates and engulfs. Followers are “pressed together” so tightly, as Arendt stated, that there is no privacy or personal space. The US Bible-based ATI cult is currently in the spotlight due to the fecund Duggar family, part of the Quiverfull movement. As with other cults, family relationships, sexuality and reproduction are tightly controlled in this movement, which is part of a powerful network of right-wing fundamentalists, with tentacles that reach into the highest echelons of the US government.

Close relationships in the group are controlled and monitored; if within-group relationships become too close, they will be broken up to prevent them from competing with the primary relationship to the leader or the group as a whole. And woe betide the follower who expresses doubts or, worse, who leaves and criticizes the group: then, as with Scientology’s disconnection policy, they become “fair game” for threats, intimidation and shunning. Or, as with many terrorist groups, the price of doubt is death.

While the inner structure is rigid and closed, looser front groups often exist in cults for recruitment, funding and influence purposes. They are “transmission belts” between the inner world of the cult and the rest of the world.


The closed structure is supported and represented by an exclusive belief system, also known as a total or extremist ideology. This all-encompassing belief system rejects all other points of view entirely, claiming to have the one truth that explains everything for all time. The structure of the ideology is arguably more important than any particular theological, political or other attributes.

The single truth is a reflection of the single point of power and control of the leadership, and it often changes at the leader’s whim. Lyndon LaRouche’s political cult, currently recruiting on US campuses and now under suspicion for the death of a young man from London, is a good example of this. He veered from a leftist Trotskyist stance early in his career to the right-wing, anti-Semitic position he now holds as head of the Worldwide LaRouche Youth Movement.

The cultic total ideology is also used to justify followers’ isolation, both from the outside world and from loved ones, in the name of a higher commitment. Tim Guest quotes Bhagwan Rajneesh, the leader of the cult he grew up in: “In a commune you will not be too attached to one family—there will be no family to be attached to.”

The totalist ideology encourages separating thinking from feeling—either you shouldn’t think (“be in your heart centre only”) or you shouldn’t feel (“feelings are subjective”). This separation of thinking from feeling—dissociation—derails a person’s ability to evaluate their situation.

However, followers don’t get the whole ideology delivered all at once. There is a distinction between the seductive early propaganda fed to new recruits and the indoctrination—or brainwashing—process that happens later on.


Coercive persuasion, or brainwashing, is used to isolate followers and control them through a combined dynamic of “love” and fear. These processes take place within the isolating cultic structure and can lead group members to follow the group’s orders, even when doing so puts their own interests, or even their lives, at risk.

Many isolating, weakening and influence strategies are used in this effort: sleep deprivation; control of relationships, information and diet; lack of privacy; and so on. Isolation, especially from very close relationships, as described above, is of particular importance.


These controlling processes, set in motion by a psychopathic leader within an isolating structure that is clothed in an absolute ideology, result in exploited, deployable followers.

Regardless of what the group may claim, the flow of resources in cultic groups moves upward to the leadership, typically in the form of money and other material assets, labor, sexual favors and uncritical obedience.

The leader’s fundamental motivation, however, is that of seeking power and control over others. While resources flow up, orders and ideology flow down to the followers.

Not all followers need to be controlled entirely, as long as they contribute in some way. Many groups have peripheral members who give money, time or other resources through front organizations. Once consolidated into the group, however, most followers may demonstrate uncritical obedience, regardless of their own survival needs.

Islamic State (IS) suicide bombers are extreme and tragic cases of the utter loss of self-interest of these deployable agents, with, of course, terrible consequences for their victims. For example, we heard of 17-year-old Talha Asmal, who died in a suicide bombing while fighting for IS, or of the death of Thomas Evans, who was recruited by al-Shabab in 2011.
Recruitment Strategies

How do followers become controlled, and why don’t they just fight back or leave?

Let’s dig into this process of brainwashing. First, though, it is important to realize that two rather separate (though overlapping) processes are at work. Recruitment, or obtaining followers, gets a person into the group’s range of influence, while retention is about creating loyal, obedient followers.

As Martha Crenshaw said, most people join terrorist groups by “accident, on their way to other goals.” The same can be said of most cult recruits, such as those recruited “off the street” or through friends and so forth.

There are “seekers”—those who are looking to join something (though no one seeks to join a cult)—and there is press-ganging, as happens to child soldiers. Others simply have the bad luck to be born or raised in the group by parents who are members, or to be born into a totalitarian state such as North Korea.

In the typical case of recruitment, the individual is recruited with an initially seductive come-on: attention, sometimes “love-bombing” and an appeal to some goal relevant to the recruit.

For example, the so-called “personal growth” cults promise to make you a better person, more effective, more “conscious.” There are endless versions of these, many making liberal use of Scientology-like “technologies.” In this phase, basic human tendencies to conform to group norms, comply with requests and obey instructions are exploited. Social psychologists have long demonstrated the power over ordinary people that these forms of social influence wield. But these processes are not sufficient to explain the uncritical obedience found in cults.

The process of retaining followers is really where the core of the brainwashing and control process takes place. While obtaining followers happens in a variety of ways, the retention process looks remarkably the same across diverse forms of cultic or extremist groups.

My own analysis relies on attachment theory, which is closely related to trauma theory. This theory states that an evolutionary adaptation fundamental to humans is the drive to seek proximity to others (initially as infants to caregivers), in order to gain protection from threat, thus improving chances for survival.

A child seeks its parent when ill, tired, frightened or in any other way under threat. The parent then functions as a safe haven for the child from whom they may gain protection and comfort. But once comforted, the child eventually wishes to explore its world again, and now the parent functions as a secure base, from which the child explores and to which they can return when protection and comfort is once again needed. Similar dynamics take place with adults in their very close relationships with spouses, partners or close friendships.

However, attachment relationships do not always function well. In particular, when the caregiver is not only the source of potential comfort, but is also the source of threat, a relationship of disorganized attachment results. Seeking comfort from the source of fear is a failing strategy: It not only brings the individual closer to the source of fear instead of escaping the threat, but it also fails to produce the required comfort, thus impeding a later exploration phase.

The person freezes—like a deer in the headlights. They are in a situation termed “fright without solution.” This failing attachment strategy causes dissociation and disorientation regarding the relationship in question: The individual is in a state of trauma and can no longer think clearly about his or her condition. We often see this dynamic in relationships of controlling domestic violence, in child abuse, or in Stockholm syndrome, where kidnap or hostage victims identify with their captors.


Within cultic groups, the isolation of followers from the outside world and from trusting relationships with others in the group leaves the group as the sole “safe haven” available to the follower.

I have worked with ex-members of many yoga and meditation cults. You start by attending a yoga class and end up having to meditate constantly on the person of the leader. In one case, devotees are instructed to breathe in the female leader’s “golden light” with each breath. This leader eventually replaces all other relationships.

As involvement in the group increases, and outside involvements decrease, the group can then ramp up its demands. Part of this stage is also to induce fear or some other kind of threat. This can be fear of the outside world, fatigue, fear of some kind of apocalyptic event or any other form of threat.

In certain religious cults, stories of a wrathful God serve this purpose, while in the Lord’s Resistance Army rape and physical terror are used. Sometimes, simple exhaustion or bullying that one is not working hard enough at one’s “development” may be the sources of threat.

Once the follower is isolated, the arousal of fear causes them to turn to the group—their only remaining “safe haven”—to seek comfort and protection, even though it is the group itself that is causing the fear.

There are two effects. Emotionally, a strong attachment bond develops to the safe haven of the group. But as the fear arousal continues and the follower never attains comfort, they continue seeking closeness—this is the emotional glue, and it operates at a physiological level. Cognitively, the disorganized or traumatic bond, which creates a state of “freezing,” means the follower can no longer think about his or her feelings regarding the fear-inducing relationship.

The follower’s disoriented thoughts are colonized by the group: The group unhooks the follower’s perception of experience from their ability to think about what is happening and can now insert their own ideology and orders. The follower may now become a deployable agent and, with their own survival needs no longer in play, they can carry out the group’s orders.

It is in this context that those incomprehensible actions—such as suicide bombings—take place. As one former cult member told me, referring to her leader, “I remember feeling like I would take a bullet for Fred.”

Breaking Away

What can help to break the situation of “fright without solution” is alternate trusting or attachment relationships that allow an escape—a solution to the threat, which in turn allows the person to think clearly again, to reintegrate their thought processes. It is thus imperative that the cult prevents any such trusting relationships from developing.

This is why we can predict that cults will systematically attempt to interfere in followers’ close relationships and prevent access to information that reflects the true nature of the traumatizing relationship.

Numerous studies reject the idea that we can profile a typical recruit. And simply teaching “critical thinking,” an idea currently popular in British universities—though a worthy goal in its own right—also isn’t sufficient. These are not problems of logic. They are problems of relationships: of grooming methods that result in recruits becoming isolated from their prior relationships, engulfed in the new isolating network and then subjected to high levels of arousal that create the trauma bond.

What we need to teach young people is precisely and specifically about the methods employed by cultic groups: from the ways they exploit universal human responses to various forms of social influence, to the vulnerability we all share when isolated from healthy sources of support. We need to teach people to recognize the difference between healthy relationships and dangerous ones.

We must create community-wide educational campaigns, similar to those currently in place about domestic abuse. The content of such campaigns should include warning signs of dangerous relationships—particularly regarding emotional and cognitive isolation.

This work should be taking place in universities, schools, communities and training programs. Former cult members are an invaluable resource in this training effort.

This problem, as we can clearly see, isn’t going away. Short-term solutions have not worked. We had better get started now on an evidence-based, long-term, public health educational campaign that puts knowledge in the hands of all of us. As long as we remain ignorant, we all remain vulnerable.

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
