Showing posts with label Michael Langone.

Jun 4, 2025

CultNEWS101 Articles: 6/4/2025 (Podcast, Prem Rawat, Neuroscience, Michael Langone, Book)



"Don Johnson interviews Paul Drescher, an ex-follower of Prem Rawat.  We discuss our experiences as young men in what we now know was and still is a cult."

Summary: New research challenges many widely held beliefs in psychology, revealing that genetics may play a greater role in shaping personality than parenting. The findings also dispute common assumptions about gender-based personality differences, the power of subliminal messaging, and the effectiveness of brain training.

Misconceptions about mental illness are also addressed, emphasizing that mental health conditions arise from complex genetic, social, and environmental factors rather than life events alone. The work calls for critical thinking, skepticism toward oversimplified media portrayals, and higher standards for psychological research and classification systems.

Key Facts:
  • Genetics Over Parenting: Evidence shows genetics may outweigh parenting in influencing adult personality.
  • Mental Illness Complexity: Disorders stem from combined genetic, environmental, and social factors.
  • Call for Reform: Greater research transparency and skepticism toward media-driven psychology myths are needed.

"Called by Name: Birth of a New Christendom," self-published by Dr. Michael David Langone, may appeal to anyone interested in philosophy, psychology, Christianity, or science fiction, especially readers drawn to all four; there's something for everyone. Here's his bio from his website:

"Michael received his PhD in Counseling Psychology from the University of California, Santa Barbara in 1979. In 1980 he began working with the International Cultic Studies Association (ICSA - then called the American Family Foundation), which he served as executive director for many years until his retirement in January 2023. Over the years he has counseled or consulted with more than a thousand individuals affected by cultic involvements and/or spiritually abusive relationships. His three areas of intellectual interests converged in his cultic studies work: psychology, religion, and philosophy."

Religion is an element of many science fiction works; in this one it's central to the story. Langone's website has a page with a brief description of the book's plot and another with a longer explanation of how the book came about, which gives an even better idea of how the book tackles those subjects.



News, Education, Intervention, Recovery


CultMediation.com   

Intervention101.com helps families and friends understand and effectively respond to the complexity of a loved one's cult involvement.

CultRecovery101.com assists group members and their families in making the sometimes difficult transition from coercion to renewed individual choice.

CultNEWS101.com offers news, links, and resources about cults, cultic groups, abusive relationships, movements, religions, political organizations, and related topics.



The selection of articles for CultNEWS101 does not mean that Patrick Ryan or Joseph Kelly agree with the content. We provide information from many points of view to promote dialogue.


Feb 23, 2022

ICSA Annual Conference: Assessment of perceptions and experiences of family members or individuals concerned about a loved one who is or was in a controlling or abusive group or relationship

Assessment of perceptions and experiences of family members or individuals concerned about a loved one who is or was in a controlling or abusive group or relationship (Panel, Parts 1 and 2)

Carmen Almendros, Michael Langone


ICSA Annual Conference

Friday, June 24th

1:00 PM - 1:50 PM (Panel Part 1)

2:00 PM - 2:50 PM (Panel Part 2)

 



"Psychological abuse within cultic groups is a worldwide social problem that has negative impacts on the health of victims-survivors, families and communities. A growing body of research has evidenced the manipulative and abusive practices endured by many followers of these groups and their deleterious effects on members and former members’ wellbeing. Despite progress here, little is known about how the involvement and/or ongoing membership of a loved person to such groups affects their family members or friends. In fact, the lack of study of the experiences and problems faced by families and relatives of victims-survivors seems to be a common research gap within other areas in which coercive controlling relationships occur (e.g., intimate partner violence). To address these gaps we conducted a study to examine the concerns, responses and experiences of family members, relatives and friends of members and former members of controlling and/or abusive groups or relationships. The initial sample of the study consisted of 230 participants who were/had been concerned over a current or past involvement of a loved one in one of such groups/relationships. Some of the participants were themselves survivors of the same groups/relationships (around 30% had been born and/or raised in such). Results showed that responses seemed very comparable with those obtained when studying family members of people with other problems (generally a diagnosis of a mental health problem) in terms of family distress and emotional experiences of caring. Understanding the difficulties and problems faced by family members and their coping responses may not only give visibility to the suffering of these close relatives of victims/survivors of abusive relationships, as well as evidence the scarcity of useful helping resources, but may contribute to inform prevention and intervention efforts on this crucial societal problem."


Carmen Almendros


Associate Professor (Profesora Titular)

Universidad Autónoma de Madrid

Carmen Almendros, PhD, is Associate Professor in the Biological and Health Psychology Department at the Universidad Autónoma de Madrid, Spain. She is on ICSA's Board of Directors and is Co-Editor of the International Journal of Cultic Studies. She has published a book and several articles on psychological abuse in group contexts, cult involvement, leaving cults, and the psychological consequences of abusive group membership. Her research interests also include the study of parental discipline and psychological violence in partner relationships. She is principal researcher of a project entitled "Coercive control as a differentiating element of violent dynamics in youth relationships: an intermethod and longitudinal study," financed by the Spanish Ministerio de Ciencia, Innovación y Universidades. She was the 2005 recipient of ICSA's Margaret Singer Award, given in honor of her research into the development of measures relevant to cultic studies.



Michael Langone

Executive Director

International Cultic Studies Association

Michael D. Langone, PhD, received a doctorate in Counseling Psychology from the University of California, Santa Barbara in 1979. Since 1981 he has been Executive Director of the International Cultic Studies Association (ICSA), a tax-exempt research and educational organization concerned about psychological manipulation and cultic groups. Dr. Langone has been consulted by several hundred former cult members and/or their families. He was the founding editor of Cultic Studies Journal (CSJ), the editor of CSJ's successor, Cultic Studies Review, and editor of Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse (an alternate of the Behavioral Science Book Service). He is co-author of Cults: What Parents Should Know and Satanism and Occult-Related Violence: What You Should Know. Currently, Dr. Langone is ICSA Today's Editor-in-Chief. He has been the chief designer and coordinator of ICSA's conferences, which in recent years have taken place in Bordeaux, Stockholm, Trieste, Barcelona, New York, Montreal, Rome, Philadelphia, Geneva, Denver, Brussels, Atlanta, Edmonton, and Madrid. In 1995, he was honored as the Albert V. Danielsen Visiting Scholar at Boston University. He has authored numerous articles in professional journals and books, including Psychiatric Annals, Business and Society Review, Sette e Religioni (an Italian periodical), Grupos Totalitarios y Sectarismo: Ponencias del II Congreso Internacional (the proceedings of an international congress on cults in Barcelona, Spain), Innovations in Clinical Practice: A Sourcebook, Handbook of Psychiatric Consultation with Children and Youth, Psychiatric News, and all of ICSA's periodicals. Dr. Langone has spoken widely to dozens of lay and professional groups, including the Society for the Scientific Study of Religion; the American Association for the Advancement of Science, Pacific Division; the American Group Psychotherapy Association; the American Psychological Association; the Carrier Foundation; various university audiences; and numerous radio and television programs, including the MacNeil/Lehrer News Hour and ABC 20/20. He is also co-editor of Cult Recovery: A Clinician's Guide to Working With Former Members and Families, published by ICSA in 2017.


Register: https://whova.com/web/icsaa_202207/


Jan 26, 2022

Book Launch – Radical Transformations in Minority Religions

Inform online launch of "Radical Transformations in Minority Religions", edited by Beth Singler and Eileen Barker

    February 10, 2022
    5:30  -  7:30 pm GMT (London, UK)
    Via Zoom
    Register


About the book:

All religions undergo continuous change, but minority religions tend to be less anchored in their ways than mainstream, traditional religions. This volume examines radical transformations undergone by a variety of minority religions, including the Children of God/ Family International; Gnosticism; Jediism; various manifestations of Paganism; LGBT Muslim groups; the Plymouth Brethren; Santa Muerte; and Satanism. 

As with other books in the Routledge/Inform series, the contributors approach the subject from a wide range of perspectives: professional scholars include legal experts and sociologists specialising in new religious movements, but there are also chapters from those who have experienced a personal involvement. The volume is divided into four thematic parts that focus on different impetuses for radical change: interactions with society, technology and institutions, efforts at legitimation, and new revelations. 

This book will be a useful source of information for social scientists, historians, theologians and other scholars with an interest in social change, minority religions and ‘cults’. It will also be of interest to a wider readership including lawyers, journalists, theologians and members of the general public.


Respondents will include

Register:

  • To register please make a donation via Paypal at https://inform.ac/upcoming-events/
  • A link to the event will be sent to the email address associated with your PayPal account. 
    • Note: If you cannot make a donation at this time, please email Inform@kcl.ac.uk to register. 

 

For more information on "Radical Transformations in Minority Religions":


Table of Contents

Part One: Internal Forces Leading to Radical Changes

  1. Radical Changes in Minority Religions: Reflections - Beth Singler
  2. What Did They Do About It? A Sociological Perspective on Reactions to Child Sexual Abuse in Three New Religions - Eileen Barker
  3. Children of Heimdall: Ásatrú Ideas of Ancestry - Karl E. H. Seigfried
  4. Varieties of Enlightenment: Revisions in the EnlightenNext Movement around Andrew Cohen - André Van Der Braak
  5. "Not all Druids wear robes" - Countercultural Experiences of Youth and the Revision of Ritual in British Druidry - Jonathan Woolley
Part Two: Technology and Institutions as Drivers of Change

  7. Santo Daime: Work in Progress - Andrew Dawson
  8. A Song of Wood and Water: The Ecofeminist Turn in 1970s-1980s British Paganism - Shai Feraro
  9. When Galaxies Collide: The Question of Jediism’s Revisionism in the Face of Corporate Buyouts and Mythos ‘Retconning’ - Beth Singler
Part Three: Change as a Part of a Process of Legitimation

  11. Regulating Religious Diversification: A Legal Perspective - Frank Cranmer And Russell Sandberg
  12. Revision or Re-Branding? The Plymouth Brethren Christian Church in Australia under Bruce D. Hales 2002-2016 - Bernard Doherty And Laura Dyason
  13. Appendix to Revision or Re-Branding? The Plymouth Brethren Christian Church 2002-2016 - PBCC
  14. Diversification in Samael Aun Weor’s Gnostic Movement - David G. Robertson
  15. Using the New Religious Movements Framework to Consider LGBT Muslim Groups - Shanon Shah
Part Four: New Prophecies or Revelations

  17. Digital Revisionism: The Aftermath of the Family International’s Reboot - Claire Borowik
  18. The Mexican Santa Muerte from Tepito to Tultitlán: Tradition, Innovation and Syncretism at Enriqueta Vargas’ Temple - Stefano Bigliardi, Fabrizio Lorusso, And Stefano Morrone
  19. From the Church of Satan to the Temple of Set: Revisionism in the Satanic Milieu - Eugene V. Gallagher
  20. The ‘Messenger’ as Source of Both Stabilization and Revisionism in Church Universal and Triumphant and Related Groups - Erin Prophet

https://www.routledge.com/Radical-Transformations-in-Minority-Religions/Singler-Barker/p/book/9780415786706 

Feb 24, 2021

Can Cult Studies Offer Help With QAnon? The Science Is Thin.

Many families have become divided over online political conspiracy theories, but the science on “brainwashing” is weak.

MICHAEL SCHULSON
Undark
February 24, 2021

DAYS BEFORE THE inauguration of President Joe Biden, at a time when some Americans were animated by the false conviction that former President Donald J. Trump had actually won the November election, a man in Colorado began texting warnings to his family. The coming days, he wrote, would be “the most important since World War II.” Trump had invoked the Insurrection Act, the man believed, and he was arresting enemies in the Vatican and other countries. Predicting turbulence ahead, the man urged his wife and two adult children to begin stockpiling essential goods.

“Watch how the world and the United States are saved!” he wrote.

The man had shown an affinity for conspiracy theories in the past, according to one of his sons, who shared the text messages with Undark, requesting that his name and other identifying characteristics of his family be withheld because he feared exposing his father to public ridicule. Recently, however, his father’s preoccupations had taken a more hard-edged and political turn — often following the twisting storylines of QAnon, a collection of right-wing conspiracy theories that describe Trump and his allies battling an international cabal of liberal pedophiles.

His father’s texts about preparing for national upheaval worried the man, and he says he began checking corners and closets in the house to see if his father was indeed stockpiling supplies. He also ordered a book by Steven Hassan, a mental health counselor in Massachusetts who calls himself “America’s leading cult expert.” And he began looking — mostly, he said, just out of curiosity — for resources on “deprogramming” a loved one whom he worried had been brainwashed.

He is far from alone in trying anew to make sense of conspiracist thinking. Since Trump supporters stormed the U.S. Capitol on Jan. 6, many carrying signs and wearing clothing emblazoned with references to “Q,” deradicalization experts who cut their teeth on studies of militant Islamic ideologies have turned their attention to Trump-aligned right-wing extremists. Social psychologists who study conspiracy theorists and misinformation have also seen a sudden spike in interest in their work.

But some Americans have also begun using the language of cults and turning to specialists in cultic studies to make sense of the surge of online disinformation and conspiratorial thinking that have accompanied Trump’s rise.

“It is not hyperbole labeling MAGA as a cult,” the progressive activist Travis Akers wrote on Twitter in late January, referring to Trump’s “Make America Great Again” slogan, and adding that hard-line Trump supporters “are sick and need help.” Television journalist Katie Couric asked “how are we going to really almost deprogram these people who have signed up for the cult of Trump?” Democratic U.S. Representative Jamie Raskin, the lead impeachment manager during Trump’s second trial, recently compared the Republican Party to a cult. And in a Reddit group where anguished relatives of QAnon adherents gather for support, or to swap various anti-cult strategies, there are many references to Hassan’s and other experts’ work.

“I’m inundated, daily, with families freaking out,” said Pat Ryan, a cult mediation expert in Philadelphia. Daniel Shaw, a psychoanalyst in the New York City area who often works with ex-group members, also described an uptick in interest. “I’ve been receiving many, many inquiries from terrified family members about a loved one who is completely lost — mentally, emotionally — in the rabbit hole of conspiracy theories,” Shaw said.

Hassan, Ryan, and Shaw are part of the small field of cult experts who focus on the experiences of people who join intense ideological movements. Some are trained psychologists and social workers; others are independent scholars and uncredentialed professionals. Many identify as former cult members themselves. But for families hoping to “deprogram” a QAnon-obsessed loved one, it’s unclear how much evidence there is behind the methods of these practitioners.

“I’ve been receiving many, many inquiries from terrified family members about a loved one who is completely lost — mentally, emotionally — in the rabbit hole of conspiracy theories.”

There’s broad agreement that “some groups harm some people sometimes,” said Michael Langone, a counseling psychologist and the director of the International Cultic Studies Association. But members of the field have sometimes clashed with academic experts, and even among themselves, especially over the notion that otherwise healthy people who subscribe to unorthodox belief systems are victims of a mental hijacking. Such thinking has received scant scientific reinforcement since sociologists, psychologists, and religious studies scholars first started pushing back on anti-cult hysteria in the U.S. decades ago. And while few cult specialists today claim to do the sort of deprogramming that gained popularity in the 1970s, some anti-cult practitioners — and licensed psychiatrists — do still embrace the idea that brainwashing and mind control pose real threats, and that they apply to online conspiracies.

Despite this, many other researchers today say that these notions simply discount human agency. For the most part, they say, people gravitate to ideas and assertions they’re already inclined to believe, and those disposed to get enthusiastic or obsessive about things will do just that, of their own volition. Still, for families divided over political conspiracy theories — and even over belief systems involving left-wing, Satan-worshipping child sex rings — many cult experts ultimately settle on advice that makes restoring and cultivating relationships the primary focus.

“Number one: Do not confront. It absolutely does not work,” said Steve Eichel, a clinical psychologist in Delaware and specialist in cult recovery. And number two: “Maintain your relationship with that person no matter what.”


THE ANTI-CULT MOVEMENT emerged in the 1970s, as a wave of new religious groups attracted young followers in the U.S. These included the Rajneeshees, whose rise in Oregon was the subject of a viral 2018 Netflix documentary; the International Society for Krishna Consciousness, better known as the Hare Krishnas; and the Unification Church of the Rev. Sun Myung Moon. These were joined by radical political organizations like the Symbionese Liberation Army, which gained national attention for the kidnapping of Patricia Hearst, an actor and heir who went on to participate in an armed bank robbery with the group.

In some cases, adherents made dramatic changes to their lives, espousing beliefs that many of their friends and relatives found to be bizarre. Some groups took extreme paths: In particular, more than 900 followers of the Peoples Temple, a group based in San Francisco, died in 1978 at Jonestown, the compound their leader had built in Guyana, most from drinking a cyanide-laced punch.

Some alarmed parents and commentators labeled many of these movements cults. They described what happened to their children as brainwashing, and even as a new kind of pathology. “Destructive cultism is a sociopathic illness which is rapidly spreading throughout the U.S. and the rest of the world in the form of a pandemic,” Eli Shapiro, a doctor whose son had joined the Hare Krishnas, wrote in the journal American Family Physician in 1977. Symptoms of the pathology, Shapiro wrote, included “behavioral changes, loss of personal identity, cessation of scholastic activities, estrangement from family, disinterest in society, and pronounced mental control and enslavement by cult leaders.”



[Image caption: News reports throughout the 1970s and 80s offered a steady drumbeat of concern over cults — and related concepts like mind control. But over time, researchers raised questions over the efficacy of "deprogramming" interventions, as well as the idea that members of new religious movements were being brainwashed. Visual: Undark]

In response, people began to organize. The American Family Foundation, launched in 1979, offered resources to families in distress. More hard-line groups, like the Cult Awareness Network, helped arrange deprogrammings of group members. In some cases, deprogrammers would kidnap a group member, detain them for hours or days, and use arguments and videos to try to undo the brainwashing.

The anti-cult movement soon ran into opposition from many sociologists and historians of religion, who argued that the anti-cultists often targeted religious movements that, while exotic to most Americans, were doing nothing wrong. They also questioned the very idea that brainwashing and deprogramming were real phenomena. In one landmark study, Eileen Barker, a sociologist at the London School of Economics, spent close to seven years studying members of the Unification Church, whose members are sometimes called Moonies, after their leader. Barker followed people who entered church recruitment seminars, and she gave them numerous personality tests to measure things like suggestibility.

Barker argued that, far from experiencing brainwashing, the large majority of people who attended recruitment seminars opted not to join the Unification Church. Those who joined and stayed, she found, actually appeared to be more strong-willed and resistant to suggestion than those who had walked away. People who joined such groups, Barker told Undark, did so because they found something that, for whatever reason, “fitted with what they were looking for and lacked in normal society.” In other words, they were members because they wanted to be members.

Today, scholars like Barker tend to eschew the term cult because of its pejorative connotations, instead sometimes referring to groups like the Unification Church as new religious movements, or NRMs. In response, some cult experts have accused sociologists and scholars of religions of whitewashing the behavior of abusive groups. But the brainwashing model also failed to gain the endorsement of many psychologists. In 1983, the American Psychological Association convened a task force to investigate the issue. The group’s members — mostly clinical psychologists and psychiatrists involved in anti-cult work — argued that groups did indeed draw members in through “deceptive and indirect techniques of persuasion and control.” But the APA’s expert reviewers were skeptical. One complained that sections of the draft report the group produced in 1986 read like an article in The National Enquirer, rather than an academic study.

“In general,” the members of the APA’s ethics board wrote in a letter rejecting the task force’s findings, “the report lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur.” (Clinical psychiatrists have been warmer toward the idea of brainwashing than research psychologists; since 1987, the Diagnostic and Statistical Manual of Mental Disorders, an authoritative source for the field, has warned of “identity disturbance due to prolonged and intense coercive persuasion” that can result from “brainwashing, thought reform, indoctrination while captive,” and other traumas.)

The cultic studies field evolved. The hard-line Cult Awareness Network was bankrupted by legal actions, including a lawsuit stemming from a botched intervention in which deprogrammers seized an 18-year-old Christian fundamentalist, restrained him with handcuffs and duct tape, and held him captive in a beach house at the behest of the man’s mother. Today, Eichel said, deprogrammings are no longer done “by anyone ethical.”

The American Family Foundation began to make peace with the sociologists. The organization also renamed itself the International Cultic Studies Association. And while differences remain among people who study cults and NRMs, Langone, who has run the organization since 1981, said he is now friends with Barker and other scholars who once clashed with his organization.

Michael Kropveld, who runs the Center for Assistance and for the Study of Cultic Phenomena, or Info-Cult, in Montreal, got his start in the field in 1978, when he helped organize the deprogramming of a friend who had joined the Unification Church. Since then, his approach has mellowed — the organization long ago abandoned deprogramming, and Kropveld said that he now finds the concept of brainwashing to be lacking.

“Using terms like brainwashing or mind control tend to imply some magical kind of process that goes on that happens to people that are unaware of what’s happening to them,” he said. Kropveld believes that techniques of influence exist, but he thinks the reasons people gravitate to groups tend to be more complicated and individualized.

Still, he acknowledged, ideas like brainwashing have an appeal. “Simplistic messages” with vivid labels, he said, “are the ones that get the most attention.”


SOME CULT EXPERTS continue to find ideas like brainwashing to be useful. One of the most prominent is Steven Hassan, a former member of the Unification Church and the author of “Combating Mind Control.” In the past, Hassan has described the internet as a vehicle for mind control and “subliminal programming,” and he recently alleged that transgender “hypno porn” is being used as a form of “weaponized mind control” to recruit young people into gender transitions.

Watching Trump run for office in 2016 led to “a bizarre kind of déjà vu,” Hassan wrote in his most recent book, “The Cult of Trump.” “It struck me that Trump was exhibiting many of the same behaviors that I had seen in the late Korean cult leader Sun Myung Moon, whom I had worshipped as the messiah in the mid-70s.”

“To jump from not liking Trump to Trump as cult leader, I think, is a bit of a leap,” Langone said.

In the days since the Jan. 6 attack on the Capitol, Hassan has offered expert analysis for CNN, The Boston Globe, Vanity Fair, and other outlets, and he has fielded questions from a popular Reddit group for people whose loved ones are QAnon adherents. (Through an assistant, Hassan declined requests for an interview with Undark, citing a busy schedule.)

Some people outside the cultic studies world have also made similar arguments, including Bandy X. Lee, a forensic psychiatrist and consultant for the World Health Organization who, until recently, taught at Yale. In an email to Undark, Lee, who has helped promote Hassan’s work, wrote that a segment of Trump’s followers resembles cult members and suggested that the former president had cultivated a kind of mass psychosis.

She applies that analysis to a wide range of right-wing positions. Asked in a phone interview whether someone who believes that climate change is overblown and that progressive tax policy is a bad idea could be said to have an individual pathology, Lee demurred. “No,” she said, “I describe them as being victims of abuse.” Specifically, she explained, they suffered from “the abuse of systems that politics and industry have employed to psychologically manipulate the population into accepting policies that undermine their health, wellbeing, and even livelihood and lives.”

Not all experts in the cultic studies world buy this. Langone, the ICSA leader, specifically praised Hassan’s contributions to the field, but acknowledged that he’s skeptical of describing Trump followers as cultists. “I can understand why people don’t like Trump,” Langone said. “But to jump from not liking Trump to Trump as cult leader, I think, is a bit of a leap.” He also fears the cultic element of QAnon is “overplayed by some of my colleagues in this field” and that the influence of QAnon itself may be overstated by media coverage.

Allegations of brainwashing are also out of step with some recent psychology research on misinformation and conspiracy theories. “How much of someone going down that rabbit hole is due to that person’s need, in a way — or this misinformation or this activity, this community — rather than these methods being pushed by whatever person is in charge?” asked Hugo Mercier, a cognitive psychologist at Institut Jean Nicod in Paris and author of the 2020 book “Not Born Yesterday: The Science of Who We Trust and What We Believe.”

Mercier argues that the brainwashing model often gets the process backward: Rather than tricking people into harmful thinking, effective propaganda — or even pure misinformation — gives them permission to openly express ideas they already found appealing.

Gordon Pennycook, a social psychologist at the University of Regina in Canada, also argues that, while it may seem to relatives that someone has changed suddenly as they fall down a rabbit hole, such accounts typically misapprehend the sequence of events. “It’s not that their minds are being taken over,” he said. “Their minds were susceptible to it in the first place. What’s been taken over is their interests, and their focus, and so on.” People who gravitate to conspiracies, Pennycook says, have consistent personality traits that make those ideas appealing. “It’s not the conspiracies that are causing them to be overly aggressive and resistant to alternative narratives,” Pennycook said. Instead, those traits are “the reason they are so strongly believing in the conspiracies.”

Many scholars of new religious movements are also skeptical of the idea that disinformation and conspiracy theories should be understood as somehow hijacking people’s minds. Megan Goodwin, a scholar of American minority religions at Northeastern University, said she has heard people describe outlets like Fox News as brainwashing. “People who are watching it are adults who are making choices to consume that media,” said Goodwin. Similarly, she said, “the people who mounted an armed insurrection to take over the Capitol are adults that made choices.” An idea like deprogramming, she added, “makes it sound like, okay, well they’ve had their agency and their faculties taken from them.”

She sees no evidence that’s the case, even if, she said, that narrative can be comforting. “They make shitty choices,” she said. “People you love are going to make shitty choices.”


SOME FAMILIES HAVE gravitated toward cult specialists in the hopes that they can, indeed, help rescue a loved one from the tangled communities that grow around online conspiracy theories — and there are such specialists who say they can offer useful guidance, even if they can’t stage a full extraction. One of those is Ryan, the cult mediation specialist in Philadelphia. Raised in Florida, Ryan joined the Transcendental Meditation movement in his late teens and spent more than a decade as an avid practitioner of the popular global meditation movement, which was founded in the 1950s. Eventually, he came to believe he was part of a cult and left.

Whether it’s to field worries about a conspiratorial loved one or to mediate disagreement over membership in a religious movement, families who work with him fill out long questionnaires and may eventually participate in sessions that involve Ryan, his business partner, and a licensed psychiatrist. (Ryan, who has a degree in Eastern philosophy and business from Maharishi International University in Iowa, is not a licensed mental health counselor. That lets him intervene in “a way that it would be difficult for me to do given my professional license,” said Eichel, the Delaware psychologist, who sometimes refers families to Ryan.)

Ryan stressed that interventions are rare; usually, the extent of their work is helping families develop strategies to maintain a relationship. When Ryan and the family do decide on an intervention, it involves months of preparation. They sometimes employ elaborate ruses to coax the person into the room for a conversation with their relatives and Ryan.

“They make shitty choices,” Goodwin said. “People you love are going to make shitty choices.”

Whether such methods are reliably effective is difficult to ascertain, and, practitioners acknowledge, there is little research on outcomes. “You can be simplistic, and lucky, and get the person out,” said Langone, the ICSA head, stressing that people’s reasons for joining and leaving groups are often highly individualized. “There are not good statistics on the effectiveness of exit counseling,” Langone said.

During a conversation in late January, Ryan estimated that, within the past year, he had consulted for roughly 20 families dealing with loved ones who had gone deep into QAnon or a similar community. He has not recommended formal interventions to any of them. “The basis of what we would recommend is to stay connected, and how to do that,” said Ryan. “Because to influence someone, you have to have a relationship with them.”

For now, the son of the Colorado conspiracy theorist said he’s gotten adept at finding ways to exit uncomfortable conversations, and he does what he can to lay low and avoid confrontation. He thinks anything else is likely to be ineffective. “I think it’s just going to ride itself out,” he said earlier this month.

He’s now less confident that will happen — especially since after the inauguration his father moved on to sharing anti-vaccination theories with his family — and he’s unsure of what the future will hold. “I just I don’t know where any of this is going to go,” he said, “with the way that there’s just so much crazy going on right now in the United States.”

Michael Schulson is a contributing editor for Undark. His work has also been published by Aeon, NPR, Pacific Standard, Scientific American, Slate, and Wired, among other publications.

https://undark.org/2021/02/24/cult-studies-qanon/

Feb 7, 2016

Inner Experience and Conversion

Michael Langone, Ph.D.; Joseph Kelly; Patrick Ryan

Abstract
Cognitive therapy is similar to religious conversion in that both are associated with changes in a person's fundamental assumptions about the world, self, and others. These fundamental assumptions derive in large part from experience, rather than rational deliberation. In some conversions, powerful inner experiences, whether manipulated ("outer generated") or not ("inner generated"), may cause a person to adopt new fundamental assumptions. Sometimes, a new set of experiences can cause a convert to reject the new assumptions and leave the group. The resulting disillusionment may cause serious adjustment problems. The impact and implications of inner experiences should be considered when trying to help former group members.
The Compact Edition of the Oxford English Dictionary defines conversion as “the action of converting or fact of being converted to some opinion, belief, party, etc.” (p. 546). This definition makes a useful distinction between “converting” and “being converted,” what I have sometimes referred to as “inner-generated” and “outer-generated” conversions.
We typically regard conversions to cultic groups as "outer-generated"; that is, as being in large part the product of manipulation and deception. But not all conversions are manipulated, not even all cultic conversions. As Zablocki has pointed out, what many of us would call cultic environments are characterized more by the difficulty people have getting out than by the diverse ways through which they get in (Zablocki, 2001). Hence, conversion to cultic groups cannot always be explained by theories of manipulation. We need other models that take into account, but are not limited by, factors of manipulation.
In this brief paper, I will propose another way to look at conversion. What I will discuss does not rise to the level of being a “theory.” I hope, however, that it points the way toward a more useful theory than those we currently have.
By definition, all conversions—manipulated and non-manipulated—presume that one’s way of viewing and relating to the world has changed in some fundamental way. (I don’t use the term “conversion” here to refer to changes of religion that are made, for example, to maintain marital harmony. I use the term only to refer to genuine and significant changes in worldview.)
What accounts for such dramatic change? Nobody really knows. There are many theories of conversion. Indeed, the disciplines that study conversion—psychology, theology, religious studies, anthropology, and sociology—embrace many competing theories.
I prefer and will discuss here a cognitive psychological approach, which assumes that human beings tend toward logical consistency in their beliefs and behaviors. I say, “tend toward” because only a fool would deny that we human beings aren’t nearly so logical as we think we are. Indeed, one of the more widely respected psychological theories—i.e., the theory of cognitive dissonance (Festinger, 1957)—addresses the ways in which people resolve inconsistencies between and among their beliefs and behaviors. Nevertheless, that we are bothered by logical inconsistencies testifies to our tendency to seek logical consistency.
The cognitive approach (Beck, 1979) assumes that people have a limited set of core assumptions about the world, the self, and others, and that numerous peripheral beliefs derive from these core assumptions. These beliefs—core and peripheral—have action consequences. When the beliefs are disordered or out of touch with reality, psychopathological behavior may ensue. Thus, Alfred Adler, Freud’s first dissenting disciple and the first modern cognitive psychologist, talked about the individual’s “private logic” (Ansbacher & Ansbacher, 1979). Neurotic individuals, according to Adler, are neurotic because their private logic includes beliefs about the world, self, and others (e.g., “I must be perfect in all that I do or I am nothing”) that cause them to come into conflict with or withdraw from other people. According to Adler, the individual’s faulty private logic develops not from how he or she handled childhood sexuality, as Freud maintained, but from how he or she handled the inferiority that is the natural condition of all children. To Adler, the key factor in development is not that children are sexual, but that they are little and weak.
Children’s fundamental assumptions about world, self, and others develop from how they and their environments respond to the unavoidable starting condition of weakness and dependence. In normal development, little, weak children are typically raised in loving, secure homes that reward their small steps toward maturity, thereby enabling them to develop a healthy self-esteem and learn how to manage in the social world that all but hermits inhabit. In neurotic development, children are typically raised in an emotionally stunted and psychologically unsafe home in which their small steps may be disparaged or ignored, causing them to develop assumptions about life that may lock them, for example, in defeatist (e.g., “I am a loser who will fail in all that I attempt”) or pretentiously compensatory (e.g., “I must be perfect in all that I do”) patterns of behavior. (Needless to say, some individuals can respond to deficient childhood environments in ways that lead them to become healthy adults, despite the environment in which they were raised. But the odds of healthy development in such an environment are, to say the least, less than in a loving, secure environment.)
The important point to keep in mind is that our fundamental assumptions about life emerge in large part from our experience, not from our rational deliberations.
Modern cognitive therapists, though rarely acknowledging Adler, say much the same thing, only more systematically. Aaron Beck, the father of modern cognitive therapy, calls the individual’s core assumptions “schemas” (Beck, 1979); Albert Ellis, founder of Rational Emotive Therapy, talks about the irrational assumptions that troubled people hold (Ellis & Harper, 1975). Indeed, psychologists have even developed instruments for assessing the ways in which a person’s thinking may be out of whack. One such measure, for example, is called the “Dysfunctional Attitude Scale.”
Cognitive therapists believe that they can more effectively help distressed people by teaching them how to recognize and challenge the core assumptions that generate conflict, and how to try out and practice assumptions and behaviors that are likely to have more desirable consequences. Hence, the perfectionist operating on the assumption that “I must be perfect in all that I do” is tactfully guided (although in Albert Ellis’s case, the individual may be bluntly directed) to the realization that this belief is irrational and produces unhappiness. Of course, helping a client get to this realization is no easy task to accomplish and requires much more tact and skill than this summary statement implies. Making such a fundamental change in one’s life doesn’t result only from rational discussion, although this can be an important factor. The change results in large part from personally experiencing the consequences of behaviors associated with other fundamental assumptions—however tentatively and even reluctantly one may have attempted these new behaviors, typically with the support and encouragement of the therapist, family, and friends.
Now, what does all of this have to do with conversion?
In conversion, as in cognitive therapy, a person’s fundamental assumptions change, and he or she tries out new behaviors consistent with the new assumptions and finds them to work, at least temporarily. Sometimes, before the conversion, the convert, like the therapy client, is troubled and unhappy with how his or her life is going. But sometimes the convert’s life is working just fine. What causes the change?
There is no simple answer to this question because there are many types of conversion, involving many types of people, coming from many types of circumstances. Hence, in what follows, I make no claim to explain all conversions. I merely hope to illuminate some.
I believe that, as with the cognitive-therapy client, personal experiences, particularly compelling inner experiences, are often the dominant factor in changing fundamental assumptions. These inner experiences may be engineered, as is sometimes the case with certain large group awareness trainings or the classic Moonie Booneville weekend. They may sometimes be a reaction to seemingly paranormal actions of a guru or other person claiming some kind of divine mandate, such as Sai Baba's appearing to make objects materialize out of thin air. Sometimes, the process of reevaluating one's fundamental assumptions may be stimulated by the experience of meeting a person who operates under a radically different set of assumptions and who appears to have achieved an enviable level of happiness or inner peace.
Once such experiences cause us to reorder, or begin reordering, our fundamental assumptions, the natural human tendency to be logically consistent drives us, over time, to reconsider and, if necessary, rearrange our peripheral beliefs and behaviors to make them more consistent with the new assumptions we are embracing. Such a process may be intellectually and emotionally challenging, so it is not surprising that we will reach out to others for support and guidance. In highly manipulative groups, somebody is always waiting in the wings to make sure that one draws the "correct" conclusions from the compelling experience that elicited the reevaluation process. In less high-pressure, more ethical groups, members may encourage a prospective convert to think carefully about the new belief system in private and over a period of time. A Franciscan priest, for example, once told me that novices were encouraged to spend a year "in the world" before taking their vows, to make sure that their vow-taking truly reflected an inner calling and wasn't merely a superficial response to psychological needs.
In some cultic groups it is not unusual for a person to go through the following stages:
  • Prospective converts perceive the leader as having some special ability or charisma (he reads minds; he heals people; he induces altered states of consciousness in people) that triggers a powerful inner experience (e.g., of the leader's spiritual "presence"), which in turn causes them to reconsider their assumptions about the world, self, and others.
  • The leader's minions, who become aware of prospects' openness to their belief system, will, frequently with much genuine concern and sincerity, do what they can to ensure that prospects make the "correct" interpretation of those experiences.
  • Prospects come to accept, at least provisionally, the fundamental assumptions, what I have sometimes called the "ruling propositions" on which the group is based—e.g., guru is God incarnate, pastor Bob is a modern-day prophet, Sister Veronica is God's messenger. The leader and/or group thus come to have a high level of credibility and authority for the prospect.
  • Prospects yield to these pressures, whether they be mild or strong, and reach a point at which they implicitly if not explicitly declare, "I believe!" The initial declaration is usually directed at the ruling propositions, e.g., "guru is God incarnate." Prospects are now converts.
  • Converts rearrange their peripheral beliefs and behaviors to make them more consistent with the new set of fundamental assumptions. Again, these actions frequently are accompanied by varying degrees of social guidance and/or pressure. For example, accepting that "guru is God" implies obeying guru, even if his orders make no sense ("Who am I to question God?").
  • Converts become comfortable with the new set of beliefs and behaviors and begin to live according to them.
  • Other group members, sometimes without realizing it, provide rewards and punishments that tend to strengthen new converts' loyalty to the group.
  • Converts become aware of inconsistencies, contradictions, abuses, or failed predictions within the group or organization.
  • Normal cognitive dissonance processes combined with group pressures cause the member to search for rationalizations to explain away these disturbing discrepancies.
  • So many disturbing discrepancies accumulate that, as one ex-member put it, the shelf of rationalization on which they were placed collapses.
  • Members once again begin to reconsider fundamental assumptions; only this time they reconsider the assumptions, the ruling propositions, of the group to which they had claimed allegiance, sometimes for many years. Support from family, friends, or professionals can sometimes facilitate such reevaluation and a decision to leave the group.
This process can sometimes be painfully disillusioning to group members or former group members, who may be reluctant to "trust," to attribute credibility to future spiritual experiences (Lucas, 2003).  Although models that stress manipulation may apply to some such cases, they do not necessarily apply.  And even when they do, the individual's inner experiences, which affect what he or she believes, are likely to have had a profound impact.  This impact and its implications should be addressed when trying to help former group members adjust to life outside the group.

References

Ansbacher, H., & Ansbacher, R. (Eds.). (1979). Superiority and social interest: A collection of later writings [of Alfred Adler], 3rd revised edition. New York: Norton.
Beck, A. T. (1979). Cognitive therapy of depression.  New York: Guilford Press.
Compact Edition of the Oxford English Dictionary.  (1971). Oxford: Oxford University Press.
Ellis, A., & Harper, R.A. (1975). A new guide to rational living. Englewood Cliffs, NJ: Prentice-Hall.
Festinger, L. (1957). A theory of cognitive dissonance. Stanford: Stanford University Press.
Lucas, P. (2003). Spiritual harm in new religions: Reflections on interviews with former members of NRMs. Cultic Studies Review (1) www.culticstudiesreview.org.
Zablocki, B. (2001). Towards a demystified and disinterested scientific theory of brainwashing.  In B. Zablocki & T. Robbins (Eds.), Misunderstanding cults: Searching for objectivity in a controversial field, pp. 159-214. Toronto: University of Toronto Press.
This material was originally prepared for a presentation at AFF’s annual conference, June 14-15, 2002, at the Crowne Plaza Hotel, Orlando (FL) Airport.

Feb 27, 2014

Cults, Psychological Manipulation and Society: International Perspectives — An Overview

Michael D. Langone

This article was originally presented as a paper at the AFF (American Family Foundation) Annual Conference held at St. Paul Campus, University of Minnesota, on 14 May 1999 by Dr Michael D. Langone, Executive Director of the AFF and editor of the Cultic Studies Journal.

This conference's title, 'Cults, Psychological Manipulation, and Society: International Perspectives', is significant because cults and related groups have aroused considerable concern around the world. I am aware of organisations concerned about cults in the following countries: USA, Canada, Mexico, Argentina, Brazil, United Kingdom, Norway, Sweden, Denmark, France, Spain, Italy, Germany, Switzerland, Belgium, the Netherlands, Austria, Poland, Greece, Russia, Malta, Israel, Japan and Australia. There are probably some of which I am not aware. The concern tends to focus on, though it is not limited to, issues related to psychological manipulation and its impact on society. These concerns generate much confusion and disputation, in large part because people define the term 'cult' in different ways.