Showing posts with label Prophecy. Show all posts

Jul 5, 2021

What Makes a Cult a Cult?

The line between delusion and what the rest of us believe may be blurrier than we think.

Zoë Heller
New Yorker
July 5, 2021
July 12 & 19, 2021 Issue

Male cult leaders sometimes claim droit du seigneur over female followers or use physical violence to sexually exploit them. But, on the whole, they find it more efficient to dress up the exploitation as some sort of gift or therapy: an opportunity to serve God, an exorcism of "hangups," a fast track to spiritual enlightenment. One stratagem favored by Keith Raniere, the leader of the New York-based self-help cult NXIVM, was to tell the female disciples in his inner circle that they had been high-ranking Nazis in their former lives, and that having yogic sex with him was a way to shift the residual bad energy lurking in their systems.

Don't Call It a Cult
According to Sarah Berman, whose book "Don't Call It a Cult" (Steerforth) focusses on the experiences of NXIVM's women members, Raniere was especially alert to the manipulative uses of shame and guilt. When he eventually retired his Nazi story—surmising, perhaps, that there were limits to how many reincarnated S.S. officers one group could plausibly contain—he replaced it with another narrative designed to stimulate self-loathing. He told the women that the privileges of their gender had weakened them, turned them into prideful "princesses," and that, in order to be freed from the prison of their mewling femininity, they needed to submit to a program of discipline and suffering. This became the sales spiel for the NXIVM subgroup DOS (Dominus Obsequious Sororium, dog Latin for "Master of the Obedient Sisterhood"), a pyramid scheme of sexual slavery in which members underwrote their vow of obedience to Raniere by having his initials branded on their groins and handing over collateral in the form of compromising personal information and nude photos. At the time of Raniere's arrest, in 2018, on charges of sex trafficking, racketeering, and other crimes, DOS was estimated to have more than a hundred members and it had been acquiring equipment for a B.D.S.M. dungeon. Among the orders: a steel puppy cage, for those members "most committed to growth."

Given that NXIVM has already been the subject of two TV documentary series, a podcast, four memoirs, and a Lifetime movie, it would be unfair to expect Berman's book to present much in the way of new insights about the cult. Berman provides some interesting details about Raniere's background in multilevel-marketing scams and interviews one of Raniere's old schoolmates, who remembers him, unsurprisingly, as an insecure bully. However, to the central question of how "normal" women wound up participating in Raniere's sadistic fantasies, she offers essentially the same answer as everyone else. They were lured in by Raniere's purportedly life-changing self-actualization "tech" (a salad of borrowings from est, Scientology, and Ayn Rand) and then whacked with a raft of brainwashing techniques. They were gaslit, demoralized, sleep-deprived, put on starvation diets, isolated from their friends and families, and subjected to a scientifically dubious form of psychotherapy known as neurolinguistic programming. Raniere was, as the U.S. Attorney whose office prosecuted the case put it, "a modern-day Svengali" and his followers were mesmerized pawns.

Until very recently, Berman argues, we would not have recognized the victimhood of women who consented to their own abuse: "It has taken the #MeToo movement, and with it a paradigm shift in our understanding of sexual abuse, to even begin to realize that this kind of 'complicity' does not disqualify women . . . from seeking justice." This rather overstates the case, perhaps. Certainly, the F.B.I. had been sluggish in responding to complaints about NXIVM, and prosecutors were keener to pursue the cult in the wake of the Harvey Weinstein scandal, but, with or without #MeToo, the legal argument against a man who used the threat of blackmail to keep women as his branded sex slaves would have been clear. In fact, Berman and others, in framing the NXIVM story as a #MeToo morality tale about coerced consent, are prone to exaggerate Raniere's mind-controlling powers. The fact that Raniere collected kompromat from DOS members strongly suggests that his psychological coercion techniques were not, by themselves, sufficient to keep women acquiescent. A great many people were, after all, able to resist his spiral-eyed ministrations: they met him, saw a sinister little twerp with a center part who insisted on being addressed as "Vanguard," and, sooner or later, walked away.

It is also striking that the degree of agency attributed to NXIVM members seems to differ depending on how reprehensible their behavior in the cult was. While brainwashing is seen to have nullified the consent of Raniere's DOS "slaves," it is generally not felt to have diminished the moral or legal responsibility of women who committed crimes at his behest. Lauren Salzman and the former television actor Allison Mack, two of the five NXIVM women who have pleaded guilty to crimes committed while in the cult, were both DOS members, and arguably more deeply in Raniere's thrall than most. Yet the media have consistently portrayed them as wicked "lieutenants" who cast themselves beyond the pale of sympathy by "choosing" to deceive and harm other women.

The term "brainwashing" was originally used to describe the thought-reform techniques developed by the Maoist government in China. Its usage in connection with cults began in the early seventies. Stories of young people being transformed into "Manchurian Candidate"-style zombies stoked the paranoia of the era and, for a time, encouraged the practice of kidnapping and "deprogramming" cult members. Yet, despite the lasting hold of brainwashing on the public imagination, the scientific community has always regarded the term with some skepticism. Civil-rights organizations and scholars of religion have strenuously objected to using an unproven—and unprovable—hypothesis to discredit the self-determination of competent adults. Attempts by former cult members to use the "brainwashing defense" to avoid conviction for crimes have repeatedly failed. Methods of coercive persuasion undoubtedly exist, but the notion of a foolproof method for destroying free will and reducing people to robots is now rejected by almost all cult experts. Even the historian and psychiatrist Robert Lifton, whose book "Thought Reform and the Psychology of Totalism" (1961) provided one of the earliest and most influential accounts of coercive persuasion, has been careful to point out that brainwashing is neither "all-powerful" nor "irresistible." In a recent volume of essays, "Losing Reality" (2019), he writes that cultic conversion generally involves an element of "voluntary self-surrender."

If we accept that cult members have some degree of volition, the job of distinguishing cults from other belief-based organizations becomes a good deal more difficult. We may recoil from Keith Raniere's brand of malevolent claptrap, but, if he hadn't physically abused followers and committed crimes, would we be able to explain why NXIVM is inherently more coercive or exploitative than any of the "high demand" religions we tolerate? For this reason, many scholars choose to avoid the term "cult" altogether. Raniere may have set himself up as an unerring source of wisdom and sought to shut his minions off from outside influence, but apparently so did Jesus of Nazareth. The Gospel of Luke records him saying, "If any man come to me, and hate not his father, and mother, and wife, and children, and brethren, and sisters, yea, and his own life also, he cannot be my disciple." Religion, as the old joke has it, is just "a cult plus time."

Acknowledging that joining a cult requires an element of voluntary self-surrender also obliges us to consider whether the very relinquishment of control isn't a significant part of the appeal. In HBO's NXIVM documentary, "The Vow," a seemingly sadder and wiser former member says, "Nobody joins a cult. Nobody. They join a good thing, and then they realize they were fucked." The force of this statement is somewhat undermined when you discover that the man speaking is a veteran not only of NXIVM but also of Ramtha's School of Enlightenment, a group in the Pacific Northwest led by a woman who claims to channel the wisdom of a "Lemurian warrior" from thirty-five thousand years ago. To join one cult may be considered a misfortune; to join two looks like a predilection for the cult experience.

"Not passive victims, they themselves actively sought to be controlled," Haruki Murakami wrote of the members of Aum Shinrikyo, the cult whose sarin-gas attack on the Tokyo subway, in 1995, killed thirteen people. In his book "Underground" (1997), Murakami describes most Aum members as having "deposited all their precious personal holdings of selfhood" in the "spiritual bank" of the cult's leader, Shoko Asahara. Submitting to a higher authority—to someone else's account of reality—was, he claims, their aim. Robert Lifton suggests that people with certain kinds of personal history are more likely to experience such a longing: those with "an early sense of confusion and dislocation," or, at the opposite extreme, "an early experience of unusually intense family milieu control." But he stresses that the capacity for totalist submission lurks in all of us and is probably rooted in childhood, the prolonged period of dependence during which we have no choice but to attribute to our parents "an exaggerated omnipotence." (This might help to explain why so many cult leaders choose to style themselves as the fathers or mothers of their cult "families.")

Some scholars theorize that levels of religiosity and cultic affiliation tend to rise in proportion to the perceived uncertainty of an environment. The less control we feel we have over our circumstances, the more likely we are to entrust our fates to a higher power. (A classic example of this relationship was provided by the anthropologist Bronisław Malinowski, who found that fishermen in the Trobriand Islands, off the coast of New Guinea, engaged in more magic rituals the farther out to sea they went.) This propensity has been offered as an explanation for why cults proliferated during the social and political tumult of the nineteen-sixties, and why levels of religiosity have remained higher in America than in other industrialized countries. Americans, it is argued, experience significantly more economic precarity than people in nations with stronger social safety nets and consequently are more inclined to seek alternative sources of comfort.

Leaving Isn't the Hardest Thing
The problem with any psychiatric or sociological explanation of belief is that it tends to have a slightly patronizing ring. People understandably grow irritated when told that their most deeply held convictions are their "opium." (Witness the outrage that Barack Obama faced when he spoke of jobless Americans in the Rust Belt clinging "to guns or religion.") Lauren Hough, in her collection of autobiographical essays, "Leaving Isn't the Hardest Thing," gives a persuasive account of the social and economic forces that may help to make cults alluring, while resisting the notion that cult recruits are merely defeated "surrenderers."

Hough spent the first fifteen years of her life in the Children of God, a Christian cult in which pedophilia was understood to have divine sanction and women members were enjoined to become, as one former member recalled, "God's whores." Despite Hough's enduring contempt for those who abused her, her experiences as a minimum-wage worker in mainstream America have convinced her that what the Children of God preached about the inequity of the American system was actually correct. The miseries and indignities that this country visits on its precariat class are enough, she claims, to make anyone want to join a cult. Yet people who choose to do so are not necessarily hapless creatures, buffeted into delusion by social currents they do not comprehend; they are often idealists seeking to create a better world. Of her own parents' decision to join the Children of God, she writes, "All they saw was the misery wrought by greed—the poverty and war, the loneliness and the fucking cruelty of it all. So they joined a commune, a community where people shared what little they had, where people spoke of love and peace, a world without money, a cause. A family. Picked the wrong goddamn commune. But who didn't."

When Prophecy Fails
People's attachment to an initial, idealistic vision of a cult often keeps them in it, long after experience would appear to have exposed the fantasy. The psychologist Leon Festinger proposed the theory of "cognitive dissonance" to describe the unpleasant feeling that arises when an established belief is confronted by clearly contradictory evidence. In the classic study "When Prophecy Fails" (1956), Festinger and his co-authors relate what happened to a small cult in the Midwest when the prophecies of its leader, Dorothy Martin, did not come to pass. Martin claimed to have been informed by various disembodied beings that a cataclysmic flood would consume America on December 21, 1954, and that prior to this apocalypse, on August 1, 1954, she and her followers would be rescued by a fleet of flying saucers. When the aliens did not appear, some members of the group became disillusioned and immediately departed, but others dealt with their discomfiture by doubling down on their conviction. They not only stuck with Martin but began, for the first time, to actively proselytize about the imminent arrival of the saucers.

Better to Have Gone
This counterintuitive response to dashed hopes animates Akash Kapur's "Better to Have Gone" (Scribner), an account of Auroville, an "intentional community" founded in southern India in 1968. Auroville was the inspiration of Blanche Alfassa, a Frenchwoman known to her spiritual followers as the Mother. She claimed to have learned from her guru, Sri Aurobindo, a system of "integral yoga," capable of effecting "cellular transformation" and ultimately granting immortality to its practitioners. She intended Auroville (its name alludes both to Sri Aurobindo and to aurore, the French word for dawn) to be the home of integral yoga and the cradle of a future race of immortal, "supramental" men and women.

The Mother does not appear to have had the totalitarian impulses of a true cult leader, but her teachings inspired a cultlike zealotry in her followers. When, five years after Auroville's founding, she failed to achieve the long-promised cellular transformation and died, at the age of ninety-five, the fledgling community went slightly berserk. "She never prepared us for the possibility that she would leave her body," one of the original community members tells Kapur. "I was totally blown away. Actually, I'm still in shock." To preserve the Mother's vision, a militant group of believers, known as the Collective, shut down schools, burned books in the town library, shaved their heads, and tried to drive off those members of the community whom they considered insufficiently devout.

Kapur and his wife both grew up in Auroville, and he interweaves his history of the community with the story of his wife's mother, Diane Maes, and her boyfriend, John Walker, a pair of Aurovillean pioneers who became casualties of what he calls "the search for perfection." In the seventies, Diane suffered a catastrophic fall while helping to build Auroville's architectural centerpiece, the Mother's Temple. In deference to the Mother's teachings, she rejected long-term treatment and focussed on achieving cellular transformation; she never walked again. When John contracted a severe parasitic illness, he refused medical treatment, too, and eventually died. Shortly afterward, Diane committed suicide, hoping to join him and the Mother in eternal life.

Kapur is, by his own account, a person who both mistrusts faith and envies it, who lives closer to "the side of reason" but suspects that his skepticism may represent a failure of the imagination. Although he acknowledges that Diane and John's commitment to their spiritual beliefs killed them, he is not quite prepared to call their faith misplaced. There was, he believes, something "noble, even exalted," about the steadfastness of their conviction. And, while he is appalled by the fanaticism that gripped Auroville, he is grateful for the sacrifices of the pioneers.

Auroville ultimately survived its cultural revolution. The militant frenzy of the Collective subsided, and the community was placed under the administration of the Indian government. Kapur and his wife, after nearly twenty years away, returned there to live. Fifty years after its founding, Auroville may not be the "ideal city" of immortals that the Mother envisaged, but it is still, Kapur believes, a testament to the devotion of its pioneers. "I'm proud that despite our inevitable compromises and appeasements, we've nonetheless managed to create a society—or at least the embers of a society—that is somewhat egalitarian, and that endeavors to move beyond the materialism that engulfs the rest of the planet."

Kapur gives too sketchy a portrait of present-day Auroville for us to confidently judge how much of a triumph the town—population thirty-three hundred—really represents, or whether integral yoga was integral to its success. (Norway has figured out how to be "somewhat egalitarian" without the benefit of a guru's numinous wisdom.) Whether or not one shares Kapur's admiration for the spiritual certainties of his forefathers and mothers, it seems possible that Auroville prospered in spite of, rather than because of, these certainties—that what in the end saved the community from cultic madness and eventual implosion was precisely not faith, not the Mother's totalist vision, but pluralism, tolerance, and the dull "compromises and appeasements" of civic life.

Far from Auroville, it's tempting to take pluralism and tolerance for granted, but both have fared poorly in Internet-age America. The silos of political groupthink created by social media have turned out to be ideal settings for the germination and dissemination of extremist ideas and alternative realities. To date, the most significant and frightening cultic phenomenon to arise from social media is QAnon. According to some observers, the QAnon movement does not qualify as a proper cult, because it lacks a single charismatic leader. Donald Trump is a hero of the movement, but not its controller. "Q," the online presence whose gnomic briefings—"Q drops"—form the basis of the QAnon mythology, is arguably a leader of sorts, but the army of "gurus" and "promoters" who decode, interpret, and embroider Q's utterances have shown themselves perfectly capable of generating doctrine and inciting violence in the absence of Q's directives. (Q has not posted anything since December, but the prophecies and conspiracies have continued to proliferate.) It's possible that our traditional definitions of what constitutes a cult organization will have to adapt to the Internet age and a new model of crowdsourced cult.

Liberals have good reason to worry about the political reach of QAnon. A survey published in May by the Public Religion Research Institute found that fifteen per cent of Americans subscribe to the central QAnon belief that the government is run by a cabal of Satan-worshipping pedophiles and that twenty per cent believe that "there is a storm coming soon that will sweep away the elites in power and restore the rightful leaders." Yet anxiety about the movement tends to be undercut by laughter at the presumed imbecility of its members. Some of the attorneys representing QAnon followers who took part in the invasion of the Capitol have even made this their chief line of defense; Albert Watkins, who represents Jacob Chansley, the bare-chested "Q Shaman," recently told a reporter that his client and other defendants were "people with brain damage, they're fucking retarded."

The Storm Is Upon Us
Mike Rothschild, in his book about the QAnon phenomenon, "The Storm Is Upon Us" (Melville House), argues that contempt and mockery for QAnon beliefs have led people to radically underestimate the movement, and, even now, keep us from engaging seriously with its threat. The QAnon stereotype of a "white American conservative driven to joylessness by their sense of persecution by liberal elites" ought not to blind us to the fact that many of Q's followers, like the members of any cult movement, are people seeking meaning and purpose. "For all of the crimes and violent ideation we've seen, many believers truly want to play a role in making the world a better place," Rothschild writes.

It's not just the political foulness of QAnon that makes us disinclined to empathize with its followers. We harbor a general sense of superiority to those who are taken in by cults. Books and documentaries routinely warn that any of us could be ensnared, that it's merely a matter of being in the wrong place at the wrong time, that the average cult convert is no stupider than anyone else. (Some cults, including Aum Shinrikyo, have attracted disproportionate numbers of highly educated, accomplished recruits.) Yet our sense that joining a cult requires some unusual degree of credulousness or gullibility persists. Few of us believe in our heart of hearts that Amy Carlson, the recently deceased leader of the Colorado-based Love Has Won cult, who claimed to have birthed the whole of creation and to have been, in a previous life, a daughter of Donald Trump, could put us under her spell.

The Delusions of Crowds
Perhaps one way to attack our intellectual hubris on this matter is to remind ourselves that we all hold some beliefs for which there is no compelling evidence. The convictions that Jesus was the son of God and that "everything happens for a reason" are older and more widespread than the belief in Amy Carlson's privileged access to the fifth dimension, but neither is, ultimately, more rational. In recent decades, scholars have grown increasingly adamant that none of our beliefs, rational or otherwise, have much to do with logical reasoning. "People do not deploy the powerful human intellect to dispassionately analyze the world," William J. Bernstein writes, in "The Delusions of Crowds" (Atlantic Monthly). Instead, they "rationalize how the facts conform to their emotionally derived preconceptions."

Bernstein's book, a survey of financial and religious manias, is inspired by Charles Mackay's 1841 work, "Memoirs of Extraordinary Popular Delusions and the Madness of Crowds." Mackay saw crowd dynamics as central to phenomena as disparate as the South Sea Bubble, the Crusades, witch hunts, and alchemy. Bernstein uses the lessons of evolutionary psychology and neuroscience to elucidate some of Mackay's observations, and argues that our propensity to go nuts en masse is determined in part by a hardwired weakness for stories. "Humans understand the world through narratives," he writes. "However much we flatter ourselves about our individual rationality, a good story, no matter how analytically deficient, lingers in the mind, resonates emotionally, and persuades more than the most dispositive facts or data."

It's important to note that Bernstein is referring not just to the stories told by cults but also to ones that lure people into all manner of cons, including financial ones. Not all delusions are mystical. Bernstein's phrase "a good story" is possibly misleading, since a lot of stories peddled by hucksters and cult leaders are, by any conventional literary standard, rather bad. What makes them work is not their plot but their promise: Here is an answer to the problem of how to live. Or: Here is a way to become rich beyond the dreams of avarice. In both cases, the promptings of common sense—Is it a bit odd that aliens have chosen just me and my friends to save from the destruction of America? Is it likely that Bernie Madoff has a foolproof system that can earn all his investors ten per cent a year?—are effectively obscured by the loveliness of the fantasy prospect. And, once you have entered into the delusion, you are among people who have all made the same commitment, who are all similarly intent on maintaining the lie.

The process by which people are eventually freed from their cult delusions rarely seems to be accelerated by the interventions of well-meaning outsiders. Those who embed themselves in a group idea learn very quickly to dismiss the skepticism of others as the foolish cant of the uninitiated. If we accept the premise that our beliefs are rooted in emotional attachments rather than in cool assessments of evidence, there is little reason to imagine that rational debate will break the spell.

The good news is that rational objections to flaws in cult doctrine or to hypocrisies on the part of a cult leader do have a powerful impact if and when they occur to the cult members themselves. The analytical mind may be quietened by cult-think, but it is rarely deadened altogether. Especially if cult life is proving unpleasant, the capacity for critical thought can reassert itself. Rothschild interviews several QAnon followers who became disillusioned after noticing "a dangling thread" that, once pulled, unravelled the whole tapestry of QAnon lore. It may seem unlikely that someone who has bought into the idea of Hillary Clinton drinking the blood of children can be bouleversé by, say, a trifling error in dates, but the human mind is a mysterious thing. Sometimes it is a fact remembered from grade school that unlocks the door to sanity. One of the former Scientologists interviewed in Alex Gibney's documentary "Going Clear" reports that, after a few years in the organization, she experienced her first inklings of doubt when she read L. Ron Hubbard's account of an intergalactic overlord exploding A-bombs in Vesuvius and Etna seventy-five million years ago. The detail that aroused her suspicions wasn't especially outlandish. "Whoa!" she remembers thinking. "I studied geography in school! Those volcanoes didn't exist seventy-five million years ago!"

https://www.newyorker.com/magazine/2021/07/12/what-makes-a-cult-a-cult

Jan 1, 2017

How to Convince Someone When Facts Fail

Why worldview threats undermine evidence

Michael Shermer
Scientific American
January 1, 2017

Have you ever noticed that when you present people with facts that are contrary to their deepest held beliefs they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is related to the worldview perceived to be under threat by the conflicting data.

Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Antivaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president's long-form birth certificate in search of fraud because they believe that the nation's first African-American president is a socialist bent on destroying the country.

In these examples, proponents' deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slayed. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, or the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.

In their 2007 book Mistakes Were Made (But Not by Me), two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at the bottom opposite corners of the base as they each stake out a position to defend.

In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identify a related factor they call the backfire effect “in which corrections actually increase misperceptions among the group in question.” Why? “Because it threatens their worldview or self-concept.” For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite ... and more: they reported being even more convinced there were WMD after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”

If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience: (1) keep emotions out of the exchange; (2) discuss, don't attack (no ad hominem and no ad Hitlerum); (3) listen carefully and try to articulate the other position accurately; (4) show respect; (5) acknowledge that you understand why someone might hold that opinion; and (6) try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people's minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.

https://www.scientificamerican.com/article/how-to-convince-someone-when-facts-fail/

Nov 29, 2016

CultNEWS101 Articles: 11/30/2016

cultnews


"Ramdev delivered a discourse on yoga to hundreds of people present in the temple. He also urged to stop the sacrifice of animals at the temple and instead sacrifice the ego from one's mind and body."


The city will develop education tools and best practices for intervention of radicalized individuals. It will train experts and monitor social networks and patterns of criminal activities.


"If the deal fructifies, Patanjali's products will be sold by Amazon through its e-commerce portal in nine countries, including the US, UK and Japan, Hindustan Times report said."




"A South African pastor spraying his congregation with a pesticide called Doom during a 'healing session' has sparked a wave of outrage on the social media. This self-proclaimed prophet, Lethebo Rabalago, heads a church called Mount of Zion General Assembly in Limpopo province. A member of his congregation had an eye infection and he used the insecticide in an attempt to heal her."




"Where, oh where, has Lyle Jeffs gone? He’s the brother of Warren Jeffs, the autocratic guru of the Fundamentalist Church of Jesus Christ of Latter-day Saints, who is serving a life term in federal prison for sexually assaulting his “child brides.”"



"Only about 64 percent of those raised Mormon continued to adhere to the faith when they entered adulthood, according to the 2014 Pew Religious Landscape Survey. That is six percent less than the numbers in 2007. 
And for those who stay, only about 25-percent of the young, single members are actually active in the faith."





"Their communication course was a leader for helping with loss for a reason. It is a very simple course improving one’s focus, patience, and ability to face and communicate with people. Those were the exact tools my personal crisis demanded. Later when I partook in auditing, Scientology’s version of psychotherapy, I had many cathartic or transcending experiences. After I left Scientology, I came to realize [their method] was really a mechanized, directed version of already existing Rogerian person-centered therapy. The “direction” additive speeds the process and adds predictability and certainty. However, it comes at an ultimately self-defeating cost. That mechanization and direction interjects the pollution of control into the process. Before too long one learns to accept control, and because of that fact, over time, he ultimately becomes owned by Scientology. If you read from Rogers’ work, it is chock full of warnings that the worst possible thing one could do with such trust-based counseling is to enter in conditions or control of any sort."





#childrenofgod

“What was your name? Who were your parents? Were you in Osaka? Switzerland?”
"Part of the problem with growing up in something so secluded as a cult is that our pasts are so unbelievable we need a witness for our own memory. And so we seek out those who remember."

"In a major setback to self-styled religious guru Asaram Bapu, the Supreme Court on Monday refused to grant him relief in connection with two rape cases that had been registered against him."




News, Intervention, Recovery

Cults101.org resources about cults, cultic groups, abusive relationships, movements, religions, political organizations and related topics.
Intervention101.com to help families and friends understand and effectively respond to the complexity of a loved one's cult involvement.
CultRecovery101.com assists group members and their families make the sometimes difficult transition from coercion to renewed individual choice.
CultNews101.com news, links, resources.
Flipboard
Twitter
Cults101 Bookstore (500 books/videos)

Selection of articles for CultNEWS101 does not mean that Patrick Ryan or Joseph Kelly agree with the content. We provide information from many points of view in order to promote dialogue.

Please forward articles that you think we should add to CultNEWS101.com.

Thanks

Jan 30, 2016

The Christmas the Aliens Didn’t Come

Julie Beck
The Atlantic
December 28, 2015

At 6 o’clock on Christmas Eve, 1954, a small group of people gathered on the street outside Dorothy Martin’s home in Oak Park, Illinois, singing Christmas carols and waiting. But this was no symbolic vigil; they weren’t waiting for the birth of baby Jesus. They were waiting to depart the Earth, and 200 more people had come to watch them wait.

A day earlier, Martin had received a message telling her the group was to wait at that place, at that time, for a flying saucer to land. They waited for 20 minutes for the “spacemen” to pick them up, as the message had promised. When none arrived, they went back inside.

This wasn’t the first time they were disappointed. It was the fourth.

It all started with a prophecy that a massive flood was coming on December 21, 1954. The message was just one of many that Martin, who was involved in Scientology and interested in flying saucers, claimed to receive from beings she called the Guardians.

“I felt a kind of tingling or numbness in my arm, and my whole arm felt warm right up to the shoulder,” she said, describing the way she would receive the messages. “Without knowing why, I picked up a pencil and a pad that were lying on the table near my bed. My hand began to write in another handwriting. I looked at the handwriting and it was strangely familiar, but I knew it was not my own. I realized that somebody else was using my hand.” The flood warning, like all the others, had flowed through her as she wrote it out, her arm possessed by these otherworldly beings.

With warnings of the coming tide came the promise that she and the other believers would be rescued by the Guardians before the flood came, on December 17. One of her most ardent supporters was Charles Laughead, a staff doctor at Michigan State in East Lansing, Michigan, who was asked to resign his position for teaching his beliefs and upsetting students. (In a Chicago Tribune article from the time, he maintained that he was fired.)

But a few of the other believers who would end up singing carols with Martin on Christmas Eve weren’t actually believers at all. They were scientists.

A team of researchers from the University of Minnesota studying social movements had learned of Martin earlier that year, and considered her and her followers a perfect field study. They began spending time with Martin in October, eventually earning her confidence, and watched how she and her followers dealt with disappointment over the next several months as their predictions repeatedly failed to pan out.

Three of the Minnesota researchers, Leon Festinger, Henry Riecken, and Stanley Schachter, recounted the believers’ story in detail in their book When Prophecy Fails, published nearly 60 years ago on January 1, 1956. The experiences of Martin and the other believers were influential on Festinger’s theory of cognitive dissonance.

According to the book, the spacemen’s arrival was originally scheduled for 4 o’clock on December 17. The believers removed all the metal from their bodies, “an act considered essential before one might safely board a saucer,” the authors write, and went out into Martin’s backyard, scanning the skies. Ten minutes went by, and then Martin, who is given the pseudonym Marian Keech in the book, “abruptly … returned to the living room.” Others trickled away, and the last believers went back inside by 5:30.

In the house, they discussed what went wrong, eventually landing on the explanation that it must have just been a practice session. “The saucers would indeed land when the time was ripe, but everyone had to be well trained, ‘well-drilled actors,’ so that when the real time arrived, things would go smoothly,” the book reads. “The spacemen were not testing their faithfulness, but were simply unwilling to leave any possibility that their human allies would make a mistake.”


Faced with evidence that directly contradicted their beliefs, the group experienced cognitive dissonance—two thoughts that are inconsistent. This is uncomfortable, and the natural instinct is to try to make it go away. People can do that in a few different ways: by trying to forget about the dissonant things, by changing their minds, or by looking for new information that gets rid of the contradiction.

Sometimes this can mean, as the alien-less Christmas demonstrated, people can react to evidence against their beliefs by leaning in to those beliefs even more. At midnight, when the 17th became the 18th, Martin claimed to receive a message that the flying saucer was coming right then and everybody had to get on board or be left behind. For her followers, this new message served as confirmation that they had been right to believe. They scrambled outside, being sure to remove any remaining metal from their persons.

“We got back outside again and Edna took me aside and said, ‘How about your brassiere? It has metal clasps, doesn’t it?’” one of the observers reported. “I went back in the house and took my brassiere off. The only metal on me was the fillings in my teeth and I was afraid someone would mention those.”

They waited until 2 a.m. this time. Still no spacemen.

But the next day, the Guardians reassured Martin with a long message that repeatedly stated: “I have never been tardy; I have never kept you waiting; I have never disappointed you in anything.”

At midnight on the 21st, the scene played out again. This time, nobody but the five observers wanted to talk afterwards about what had happened. And then came the Christmas Eve disappointment, which had so many witnesses because the believers had sent out a press release about it. By this point, the cognitive dissonance was strong, as evidenced by this (condensed) conversation between Laughead (given the pseudonym Thomas Armstrong in the book) and a news reporter after the Christmas Eve debacle:

Newsman: Dr. Armstrong, I wanted to talk to you with reference to this business about—you know—you’re calling the paper to say you were going to be picked up at 6 o’clock this evening. Ahh, I just wanted to find out exactly what happened. ... Didn’t you say they sent a message that you should be packed and waiting at 6 p.m. Christmas Eve?

Armstrong: No.

Newsman: No? No, I’m sorry, sir. Weren’t the spacemen supposed to pick you up at 6 p.m.?

Armstrong: Well, there was a spaceman in the crowd with a helmet on and a white gown and what not.

Newsman: There was a spaceman in the crowd?

Armstrong: Well, it was a little hard to tell, but of course at the last when we broke up, why there was very evidently a spaceman there because he had his space helmet on and he had a big white gown on.

Newsman: And what did he say? Did you talk to him?

Armstrong: No, I didn’t talk to him.

Newsman: Didn’t you say you were going to be picked up by the spacemen?

Armstrong: No.

Newsman: Well, what were you waiting out in the street for singing carols?

Armstrong: Well, we went out to sing Christmas carols.

Newsman: Oh, you just went out to sing Christmas carols?

Armstrong: Well, and if anything happened, well, that’s all right, you know. We live from one minute to another. Some very strange things have happened to us and—

Newsman: But didn’t you hope to be picked up by the spacemen? As I understand it—

Armstrong: We were willing.

Newsman: Uhuh. Well, how do you account for the fact that they didn’t pick you up?

Armstrong: Well, as I told one of the other news boys, I didn’t think a spaceman would feel very welcome there in that crowd.

Newsman: Oh, a spaceman wouldn’t have felt welcome there.

Armstrong: No, I don’t think so. Of course, there may have been some spacemen there in disguise, you know. We couldn’t see. I think—I think that’s quite possible.

Perhaps the most powerful example of trying to reaffirm beliefs after these disappointments came on Christmas Day, when a new observer affiliated with the researchers showed up on Martin’s doorstep, attempting to gain entry into the group. Suspecting that this new visitor might be a spaceman, Martin and Laughead questioned him intensely, asking him to tell stories and seating him at a place of honor at the dinner table. But the next day, Martin got fed up, asking him, “Are you sure that you have no message for me? Now that we are alone, we can talk.”

“The experiences of this observer well characterize the state of affairs following the Christmas caroling episode—a persistent, frustrating search for orders,” Festinger and his co-authors write. After this, the believers began to disperse, leaving Martin’s home for their own, though not all of them lost their faith. Martin did not—in fact, she went on to found the Order of Sananda and Sanat Kumara (the names of two of the Guardians), calling herself “Sister Thedra.”

The lesson the researchers learned from all this, as they wrote in the introduction to When Prophecy Fails: “A man with a conviction is a hard man to change.” And when that conviction is as important as the promise of salvation coming from the sky, “it may even be less painful to tolerate the dissonance than to discard the belief and admit one had been wrong.”

Julie Beck is a senior associate editor at The Atlantic, where she covers health.

http://www.theatlantic.com/health/archive/2015/12/the-christmas-the-aliens-didnt-come/421122/

Oct 4, 2015

Forum on prophecy slated at Parkside (Kenosha, WI)

Kenosha (WI) News
October 4, 2015


SOMERS — A forum on “What Happens When Prophecy Fails? Exploring ‘End of World’ Cults” begins at noon Monday in Room D-139 of Molinaro Hall at the University of Wisconsin-Parkside, 900 Wood Road.

The program, part of the Religious Issues Forums, will feature Tony Larsen, pastor at Olympia Brown Unitarian Universalist Church in Racine.

The presentation is free and open to the public.

For more information, call Wayne G. Johnson at 262-554-1613.

http://www.kenoshanews.com/news/event_briefs_forum_on_prophecy_slated_at_parkside_484676582.html