
Aug 25, 2025

CultNEWS101 Articles: 8/25/2025

Online Event, Colonia Dignidad, Chile, Siddha Yoga, Book Review, Jagadguru Shri Kripalu

Join the webinar on August 28 at 12:00 on Psyflix Scandinavia. Host: Anne Hilde Vassbø Hagen

Shame can be crippling - especially for those who have lived under psychological control in sects, extreme groups or other unhealthy communities.

On August 28th, psychologist, author and cult survivor Cathrine Moestue will hold a webinar on Psyflix Scandinavia, where she shares her own experiences and valuable knowledge about what it takes to regain mental health and the capacity to work after leaving a manipulative community.

In this webinar, you will gain insight into:

"With sloping red-tiled roofs, trimmed lawns and a shop selling home-baked ginger biscuits, Villa Baviera looks like a quaint German-style village, nestled in the rolling hills of central Chile.

But it has a dark past.

Once known as Colonia Dignidad, it was home to a secretive religious sect founded by a manipulative and abusive leader who collaborated with the dictatorship of Augusto Pinochet.

Paul Schäfer, who established the colony in 1961, imposed a regime of harsh punishments and humiliation on the Germans living there.

They were separated from their parents and forced to work from a young age.

Schäfer also sexually abused many of the children.

After Gen Pinochet led a coup in 1973, opponents of his military regime were taken to Colonia Dignidad to be tortured in dark basements.

Many of these political prisoners were never seen again.

Schäfer died in prison in 2010, but some of the German residents remained and have turned the former colony into a tourist destination, with a restaurant, hotel, cabins to rent and even a boating pond.

Now, the Chilean government is going to expropriate some of its land to commemorate Pinochet's victims there. But the plans have divided opinions."

The New Leaving Siddha Yoga Site is Live
www.leavingsiddhayoga.net

• Easy to navigate and search
• Several new, compelling stories
• Info on the recent lawsuit against SYDA
• A new academic paper

Book Review: "Sex God: The Secret Life of a Dark, Dark Guru" by Karen Jonson
Karen Jonson's "Sex God: The Secret Life of a Dark, Dark Guru" is a powerful memoir and exposé that reveals the troubling truths about her former spiritual leader, Jagadguru Shri Kripalu Maharaj.

In this unflinching account, Jonson shares her experiences within a cult and details the harrowing realities faced by its victims. The book serves as a critical examination of the dangers of blind faith, the reality of cult abuse, and the resilience required to survive and speak out against such experiences.

Jonson's work is a vital resource for understanding how charisma and spiritual authority can be manipulated for harm. It is essential reading for anyone interested in cult psychology, survivor stories, and the ongoing fight for accountability within religious institutions.

This book is not for the faint of heart—it confronts the darkness that often lies beneath the surface of supposed spiritual enlightenment.




Mar 26, 2022

Heaven's Gate 25 Years Later: Former Cult Member Turned Recovery Counselor Explains How Cults Recruit

Daniel Shaw, a former member of Siddha Yoga, offers insight into how tragedies like Heaven's Gate occur

Mary Ellen Cagnassola
People
March 26, 2022

On March 26, 1997, 39 people dressed in matching black shirts, black sweatpants and Nike Decades prepared themselves to board a spacecraft trailing the approaching Hale-Bopp comet and ascend to another level of human existence.

At least that's what they had been told by Marshall Applewhite, the cunning and charismatic leader of the group that called itself Heaven's Gate. In separate groups, Applewhite and his followers mixed the barbiturate phenobarbital into applesauce and washed it down with a swig of vodka, tying plastic bags over their heads to ensure asphyxiation.

Over the course of three days, the Southern California mansion that the group called home would become the site of what remains the largest mass suicide in United States history. Police would find Applewhite and the Heaven's Gate members draped in purple shrouds — save for two who were the last to die — each with a five-dollar bill and three quarters in their pockets.

"Being in the Heaven's Gate cult was an experience in which I gave my power away on all levels," Frank Lyford, a Heaven's Gate member who left the group before its tragic end, told PEOPLE in 2019. "I had to wake up to the fact that I had given that power away before I could wake up to the fact that I could take it back."

Heaven's Gate, now known as one of the most notorious cults, isolated its members from their loved ones and the outside world. They sustained themselves through an early-Internet web design business, Higher Source, and according to survivors who left the group, many never set foot outside their Rancho Santa Fe compound.

To mark the 25th anniversary of the tragedy, PEOPLE sat down with Daniel Shaw, a psychoanalyst with expertise in cult recovery and a former member of the Siddha Yoga group, to talk about the warning signs of cult ideology, its modern-day iterations and how to help someone in danger.

Siddha Yoga, a spiritualist group that rose to popularity in the 1970s, was founded by the guru Swami Muktananda and later taken over by Swami Chidvilasananda, also known as Gurumayi. The group is still operational, and its leaders have been accused of sexual abuse, pedophilia, harassment, rape and other crimes.

PEOPLE: Tell me a little about your background and your journey to helping victims of cult ideology. Do you consider yourself a survivor?

Shaw: I entered the mental health profession after spending 13 years participating in a religious group, which I came to view as an abusive cult. Once I was licensed as a mental health professional, I began to work with survivors of cults, families with loved ones in cults, and other kinds of abusive, controlling relationships and groups. My own experience with that kind of group is what has led to my working with other survivors, and I consider myself a survivor.

I was a full-time resident and worker in a religious group, where ultimately I was treated in a very abusive way and exploited. It took me more than a decade to understand what was happening. Finally, once I left, I began to study cults, and I met other cult experts, and then wrote about my experience, specifically about traumatic abuse in cults.

PEOPLE: How did you come to join Siddha Yoga?

Shaw: I grew up in a pretty secular Jewish home in New York, and there was no cult activity on anybody's part. We were a socially conscious family, progressive. I was a young adult in the '70s, when everybody was trying to recruit you into something. I managed to avoid getting recruited into anything until the end of my 20s, when I was drawn to Siddha Yoga, which was very popular at the time with a lot of people in the arts, a world I was involved with. I got fully drawn into it and decided, at a certain point, to commit myself to it. Ten years later, I left, recognizing how abusive the group really was and how abusive the leader really was.

PEOPLE: Why do people feel compelled to join these abusive groups?

Shaw: The reason anybody gets involved in a group that ultimately can be seen as a cult is that this group is promising a community of people who are committed to a meaningful purpose. In the case of Siddha Yoga, the meaningful purpose was to bring peace and enlightenment to others through meditation. My own experience, upon being introduced to the group and meditation, was absolutely phenomenal and felt incredibly different to anything I'd experienced before.

I felt connection and peace in a way that I had never felt, and it actually immediately improved some of the things I was struggling with — my anxiety — and it helped me feel much more motivated and more positive. So the initial impact of being introduced to meditation through this group was very powerful. Little by little, my experiences were so meaningful and powerful that I wanted to become a part of the organization, not just a visitor, because it felt like the most meaningful thing I had ever experienced.

Most people who enter this kind of group — and it doesn't have to be religious; it can be political, a business-oriented group, a self-help group — are looking to be more successful, more productive, happier in their personal lives, and able to contribute more to society. So the idealistic aims of the people who get involved are taken advantage of in these groups, because the groups themselves claim to have all of these kinds of idealistic purposes.

When a group is a cult, it's because the leader is a malignant narcissist, and these kinds of narcissists can be profound and incredibly charismatic. They make all kinds of claims of super intelligence and all kinds of accomplishments; very often those are fraudulent claims. Back when I got involved, there was no internet, so you didn't have a place to look up a group and find out its background. People who join initially believe that they're part of something very meaningful and important, that they can make a meaningful contribution to it, and that they can benefit from it. But the deeper you go in, there's really never an end to what's going to be asked of you. You're going to be asked to give and give and give. If you're not receiving what you were told you would, you'll be told that that's your fault, because you're not giving enough.

PEOPLE: How did you realize you were being taken advantage of and leave the group?

Shaw: In my own case, and in the case of most people in these groups, the participation is initially very exciting and very invigorating. You're in a group of people who are similarly highly motivated, and everybody's working very hard toward the goals of making the group a success and bringing the group's message to a wider audience. But as you get more and more deeply involved in a cultic group, what you begin to understand is that the leader is entirely self-aggrandizing and that the stated purpose and mission is never getting fulfilled. The only thing happening is that the leader is becoming more powerful in every way: having more control over the followers, having more money, and in many cases having whatever person they want to have sex with at their disposal.

Most people who spend any length of time deeply involved in the group become exhausted and burned out. They work night and day, they're always on high alert, and everything is always a crisis. Many cult members blame themselves for failing to be committed and motivated enough. I certainly was in that position for a while before I left, and once I left, I did so because I witnessed a great deal of abuse and cruelty and manipulation of people, including myself. I heard a story from another follower about a young woman in the group who was being sexually abused by one of the higher-ups and was told that it was her fault, and that she should never tell her mother. Hearing that story is what finally snapped me out of it.

PEOPLE: What are some warning signs that someone is becoming a victim of cult ideology?

Shaw: In a group that has the characteristics of a cult, you don't necessarily have immediate exposure to the leader; at first, you're more exposed to the followers. The followers who are already involved are eager to welcome new recruits and make them feel very important and special. We call that love bombing. Once you've been recruited and you really want to commit to being in the group, you may start to have more exposure to the leader and to other group dynamics, because the group tends to follow the behavior of the leader. In social psychology, that behavior is called intermittent reinforcement. So if you're in a relationship or a group in which your experience is that you are greatly appreciated and loved and paid attention to, interspersed with being intimidated, that's the first and most important red flag.

PEOPLE: What can loved ones do to help a friend or family member who is recruited by one of these groups?

Shaw: Loved ones are in a very painful situation. They often feel helpless and unable to reach the person who's gotten involved in this kind of group. Being angry at them, confronting them, trying to prove them wrong, typically will drive people deeper into the group and further alienate them from you. So, to avoid creating further alienation, family members can try to extend themselves in a loving and empathetic way. It's very difficult, because they have to hide their fear and their pain when they do that, but by maintaining that connection, they have a chance sooner or later — and it's often quite a bit later — of that family member coming back and realizing they are loved unconditionally. That is an ideal situation.

PEOPLE: In the case of Heaven's Gate, how do these groups escalate to such a level of tragedy?

Shaw: Most groups don't end in this kind of mass suicide tragedy, but the ones that have ended in that way certainly get the most publicity, because it's the ultimate example: giving everything to the group and to its mission, including your life. Many, many people are in cults where they're giving everything just short of their actual life, and are being drained and exhausted. But when a group goes to that ultimate level, it has to do with the acute, near-schizophrenic kind of paranoia of the leader. For many group members, the leader is God, and if God is saying something, then it must be true. It's devastating for the survivors, who were helpless all along to extract their loved one from the group. Law enforcement has its hands tied. We have laws about religious freedom, for example, or other kinds of freedoms.

PEOPLE: How can one spread awareness of the dangers of cult ideology in a way that avoids simply retelling its more sensational associations?

Shaw: There's a problem currently with trust in sources of information. Many people who are involved in groups such as QAnon and other splinter groups are only receiving information from very limited sources, and are convinced that any other information is fake news. This is one of the problems we face: people can easily just decide that they don't trust any information other than from the source that they're getting it from. There's no easy answer, but reliable information that is researched and backed up by evidence is available, and anybody who wants to research a group now and learn about its history can readily find all of that information.

A group like QAnon is much more diffuse, and there are many branches. It's also fed by different groups that seek different aims. We don't quite have the means of fully protecting our citizens from disinformation. It's unfortunate that the internet has become one of the biggest recruiting tools in the history of the planet for actors who are creating cult-like groups.

https://people.com/crime/revisiting-heavens-gate-cult-expert-explains-how-groups-recruit/

Sep 21, 2021

CultNEWS101 Articles: 9/21/2021

Book Review, Siddha Yoga, NXIVM, Legal, Sadhvi Bhagawati Saraswati

Book Review: Daughters of the Goddess: The Women Saints of India by Linda Johnsen, reviewed by Joe Kelly
"The myths of India are rife with female goddesses both terrifying and placid. From the blood-filled mouth of Durga to the generous beneficence of Lakshmi, the varieties of religious experience are conveyed through graphic images. In Linda Johnsen's naïve treatise on women "saints" in India, we get a true believer's take on a few individuals who have become well known in today's spiritual marketplace. Goddess worship is embraced by many "New Age" Westerners as the cutting edge of millennial spirituality; yet, it often ignores the ancient traditions of the East. Those Westerners, both male and female, who idealize their teacher's status as divine risk getting caught up in a culture they neither understand nor have fully explored. It is often the exotic or eccentric that gets mistaken for the Divine."

Page Six: Allison Mack enjoys one of her last meals as a free woman ahead of prison stint
"Allison Mack is enjoying her last few days of freedom before she's expected to serve three years for her role in the Nxivm sex cult scandal.

"Smallville" actress Mack, 39, was snapped out and about with a new man in Long Beach, Calif. as they grabbed a bite to eat at Little Coyote Pizza Thursday.

The couple was spotted lunching, shopping and laughing while enjoying each other's company at the outdoor eatery.

Mack wore a sundress with black boots that covered her court-ordered ankle monitor while her unidentified beau kept it simple in a camel-colored shirt, black button-up and black jeans.

Back in June, Mack was sentenced to 36 months for serving as a Nxivm "slave master" who brainwashed women into becoming sex slaves for the group's twisted leader, Keith Raniere.

"In the language of [the cult], you were a slave as well as a master … It is hard to determine an appropriate sentence for a perpetrator who is also her co-conspirator's victim," Brooklyn federal court Judge Nicholas Gaurafis told Mack during her sentencing.

NY Post: Nxivm co-founder Nancy Salzman slapped with 3½ years in prison
"Nxivm's co-founder Nancy Salzman was hit with three and a half years in prison Wednesday for her unwavering support of sex cult leader Keith Raniere — and her vicious targeting of his enemies and critics.

Wearing a white blouse and black slacks, Salzman appeared in Brooklyn federal court and apologized for her senior role in the twisted group that drained its members of their cash and operated a secret sorority that groomed young women as sex slaves for Raniere.

But Salzman, 67, claimed she, too, was a victim of the sick Svengali, telling Judge Nicholas Garaufis she didn't deserve to go to prison.

But the jurist rejected her plea for leniency.

"You were Mr. Raniere's second-in-command and shared his power," railed Garaufis. "You enabled and facilitated Mr. Raniere's heinous crimes. In your 20 years at Nxivm, the door was always open but you never left."

Raniere was sentenced to 120 years in prison.

Salzman won't start serving her time until January 19 after undergoing an undisclosed medical procedure."

RNS: An American Jew turned Hindu holy woman tells her story
In September 1996, a young graduate student at Stanford University accompanied her seeker husband on a trip to the holy city of Rishikesh in India. A vegetarian who loved Indian food, she knew nothing about India or its central religious tradition: Hinduism. She had grown up Jewish.

Hot and sweaty from a day of travel and wanting to cool off by the banks of the Ganges, which Indians call Ganga, she walked down to the river to dip her toes in the water Hindus revere as a goddess.

What happened there — and at an ashram just a few feet away — was an intense spiritual experience, an awakening to the divine that changed her life forever.

Sadhvi Bhagawati Saraswati, as she is now known (she does not reveal her given name), quit her Ph.D. program, divorced her husband and became a Hindu renunciate, someone who takes vows of chastity, simplicity and nonattachment.

Now she's written a memoir about her life — "Hollywood to the Himalayas: A Journey of Healing and Transformation" — in which she reveals a less than happy Los Angeles childhood of sexual abuse at the hands of her father followed by the eating disorder bulimia.

But most of the memoir is devoted to the 25 years she has spent at the Parmarth Niketan ashram in Rishikesh working alongside its president and spiritual leader, Pujya Swami Chidanand Saraswati.

There she helped edit The Encyclopedia of Hinduism, a multivolume compendium conceived by Chidanand Saraswati and written by a group of international scholars. But she has also devoted herself to seva, a Sanskrit word meaning "selfless service." She and her guru have undertaken multiple humanitarian and environmental projects to alleviate poverty by installing toilets, building schools and health care clinics, and providing emergency relief after natural disasters. They also travel around the world to teach about Hinduism.

Religion News Service spoke to Sadhvi Bhagawati Saraswati while she was in the U.S. to care for her mother, who suffered a stroke, as well as to promote the memoir."

News, Education, Intervention, Recovery


CultEducationEvents.com

CultMediation.com   

Intervention101.com helps families and friends understand and effectively respond to the complexity of a loved one's cult involvement.

CultRecovery101.com assists group members and their families in making the sometimes difficult transition from coercion to renewed individual choice.

CultNEWS101.com: news, links, resources.

Facebook

Flipboard

Twitter

Instagram

Cults101.org: resources about cults, cultic groups, abusive relationships, movements, religions, political organizations and related topics.


Selection of articles for CultNEWS101 does not mean that Patrick Ryan or Joseph Kelly agree with the content. We provide information from many points of view in order to promote dialogue.


Please forward articles that you think we should add to cultintervention@gmail.com.


Oct 29, 2020

CultNEWS101 Articles: 10/29/2020

NXIVM, Siddha Yoga, People of Praise, Golden Dawn, Greece, Legal 

I understand how good it feels to find a teacher who sees you.

"I'm holding back the urge to scream "Wake up!" at Alison Mack, an actor involved in NXIVM (pronounced "Nixium"), the cult depicted in HBO's documentary series "The Vow."

I grip the round seam of the sofa cushion while watching her meet her creepy guru, Keith Raniere, for the first time. She cries as he reflects back to her the way she condemns herself by limiting her feelings of bliss to art.

It's obvious to viewers that he's seducing her, but I also understand, from my psychotherapy training, that he is mirroring her, a process in which the therapist reflects something back to the client that the client hasn't been able to articulate. The stark relief of being seen is intensely powerful; it is the fulfillment of a need the client didn't know existed but had been missing all along. 

I also understand how good it feels to find a teacher who sees you. In the 1990s, I was accepted as a full-time sevite — a staff member who does selfless service — in exchange for room and board at the Siddha Yoga ashram in the Catskills."

"The Canadian starlet revealed in newly unsealed court documents that she was Nxivm leader Keith Raniere's "partner" for 10 years.

In a letter to a Brooklyn federal judge ahead of Raniere's sentencing for running a master-slave group within the upstate organization, Clyne argued that it was "absurd" to say it "was created for Keith to have sex partners" — and she should know.

"I find this idea completely absurd and even offensive — as a woman and a partner of Keith's for over a decade," Clyne wrote in a letter of support for Raniere unsealed Tuesday [10/20/2020] in Brooklyn federal court.

"I have never known Keith to want intimacy with someone who doesn't want it, and it's a ridiculous notion to think he would have gone to all that trouble for sex."

Clyne's letter was one of dozens written by former students and supporters ahead of his sentencing for sex-trafficking and other charges next week.

The 37-year-old actress has previously been identified by federal prosecutors as having been a part of Raniere's "inner circle" or "first-line masters" in the secret group, called DOS — along with her wife, former "Smallville" star Allison Mack.

Mack, purportedly Raniere's right-hand woman in DOS, has herself pleaded guilty to racketeering and conspiracy charges, including extortion and forced labor."

The Guardian: Revealed: ex-members of Amy Coney Barrett faith group tell of trauma and sexual abuse
"People of Praise hire lawyers to investigate historical sexual abuse allegations as former members speak of 'emotional torment'

Amy Coney Barrett's nomination to the supreme court has prompted former members of her secretive faith group, the People of Praise, to come forward and share stories about emotional trauma and – in at least one case – sexual abuse they claim to have suffered at the hands of members of the Christian group.

In the wake of the allegations, the Guardian has learned that the charismatic Christian organization, which is based in Indiana, has hired the law firm of Quinn Emanuel Urquhart & Sullivan to conduct an "independent investigation" into sexual abuse claims on behalf of People of Praise.

The historic sexual abuse allegations and claims of emotional trauma do not pertain specifically to Barrett, who has been a lifelong member of the charismatic group, or her family."

" ... Legal experts note role female lawyers took in confronting far-right party's violent tactics

The dark episode of Golden Dawn – its meteoric rise from being a fringe movement 40 years ago to Greece's third-biggest party on the back of protest votes over EU-dictated austerity – has raised disquieting questions.

When historians look back they will also see a nation whose political class was inexcusably slow in dealing with the rightwing menace and a society whose silence was deafening. A police force whose complicity enabled the extremists to act with impunity – until their murder of a popular anti-fascist Greek hip-hop artist, Pavlos Fyssas, provoked a backlash that was impossible to ignore – has already been illuminated by the trial. Officers who sympathised with the group, covering up attacks on leftists, migrants and refugees and the LGBTQ community, were among the hearing's 68 defendants.

Instead, it took the justice system, viewed as one of the country's few meritocratic institutions, to confront the party's violent tactics and thuggish behaviour."



Mar 13, 2017

This Article Won’t Change Your Mind

The facts on why facts alone can’t fight false beliefs

JULIE BECK
The Atlantic
March 13, 2017

“I remember looking at her and thinking, ‘She’s totally lying.’ At the same time, I remember something in my mind saying, ‘And that doesn’t matter.’” For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith exactly. It was something he chose. “I remember actually consciously making that choice.”

There are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you.

Back in 1980, Shaw had arrived at a Siddha Yoga meditation center in upstate New York during what he says was a “very vulnerable point in my life.” He’d had trouble with relationships, and at work, and none of the therapies he’d tried really seemed to help. But with Siddha Yoga, “my experiences were so good and meditation felt so beneficial [that] I really walked into it more and more deeply. At one point, I felt that I had found my life’s calling.” So, in 1985, he saved up money and flew to India to join the staff of Gurumayi Chidvilasananda, the spiritual leader of the organization, which had tens of thousands of followers. Shaw rose through the ranks, and spent a lot of time traveling for the organization, sometimes with Gurumayi, sometimes checking up on centers around the U.S.

But in 1994, Siddha Yoga became the subject of an exposé in The New Yorker. The article by Lis Harris detailed allegations of sexual abuse against Gurumayi’s predecessor, as well as accusations that Gurumayi forcibly ousted her own brother, Nityananda, from the organization. Shaw says he was already hearing “whispers” of sexual abuse when he joined in the ’80s, but “I chose to decide that they couldn’t be true.” One day shortly after he flew to India, Shaw and the other staff members had gathered for a meeting, and Gurumayi had explained that her brother and popular co-leader was leaving the organization voluntarily. That was when Shaw realized he was being lied to. And when he decided it didn’t matter—“because she’s still the guru, and she’s still only doing everything for the best reasons. So it doesn’t matter that she’s lying.” (For her part, Gurumayi has denied banishing her brother, and Siddha Yoga is still going strong. Gurumayi, though unnamed, is presumed to be the featured guru in Elizabeth Gilbert’s 2006 bestseller Eat, Pray, Love.)

But that was then. Shaw eventually found his way out of Siddha Yoga and became a psychotherapist. These days, he dedicates part of his practice to working with former cult members and family members of people in cults.

The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.


“A man with a conviction is a hard man to change,” Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1956 book about this study. “Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.”

This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as “motivated reasoning.” Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

It starts at the borders of attention—what people even allow to breach their bubbles. In a 1967 study, researchers had undergrads listen to some pre-recorded speeches, with a catch—the speeches were pretty staticky. But, the participants could press a button that reduced the static for a few seconds if they wanted to get a clearer listen. Sometimes the speeches were about smoking—either linking it to cancer, or disputing that link—and sometimes it was a speech attacking Christianity. Students who smoked were very eager to tune in to the speech that suggested cigarettes might not cause cancer, whereas nonsmokers were more likely to slam on the button for the antismoking speech. Similarly, the more-frequent churchgoers were happy to let the anti-Christian speech dissolve into static while the less religious would give the button a few presses.

Outside of a lab, this kind of selective exposure is even easier. You can just switch off the radio, change channels, only like the Facebook pages that give you the kind of news you prefer. You can construct a pillow fort of the information that’s comfortable.

Most people aren’t totally ensconced in a cushiony cave, though. They build windows in the fort, they peek out from time to time, they go for long strolls out in the world. And so, they will occasionally encounter information that suggests something they believe is wrong. A lot of these instances are no big deal, and people change their minds if the evidence shows they should—you thought it was supposed to be nice out today, you step out the door and it’s raining, you grab an umbrella. Simple as that. But if the thing you might be wrong about is a belief that’s deeply tied to your identity or worldview—the guru you’ve dedicated your life to is accused of some terrible things, the cigarettes you’re addicted to can kill you—well, then people become logical Simone Bileses, doing all the mental gymnastics it takes to remain convinced that they’re right.

People see evidence that disagrees with them as weaker, because ultimately, they’re asking themselves fundamentally different questions when evaluating that evidence, depending on whether they want to believe what it suggests or not, according to psychologist Tom Gilovich. “For desired conclusions,” he writes, “it is as if we ask ourselves ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’” People come to some information seeking permission to believe, and to other information looking for escape routes.

In 1877, the philosopher William Kingdon Clifford wrote an essay titled “The Ethics of Belief,” in which he argued: “It is wrong always, everywhere, and for anyone to believe anything on insufficient evidence.”

Lee McIntyre takes a similarly moralistic tone in his 2015 book Respecting Truth: Willful Ignorance in the Internet Age: “The real enemy of truth is not ignorance, doubt, or even disbelief,” he writes. “It is false knowledge.”

Whether it’s unethical or not is kind of beside the point, because people are going to be wrong and they’re going to believe things on insufficient evidence. And their understandings of the things they believe are often going to be incomplete—even if they’re correct. How many people who (rightly) believe climate change is real could actually explain how it works? And as the philosopher and psychologist William James noted in an address rebutting Clifford’s essay, religious faith is one domain that, by definition, requires a person to believe without proof.

Still, all manner of falsehoods—conspiracy theories, hoaxes, propaganda, and plain old mistakes—do pose a threat to truth when they spread like fungus through communities and take root in people’s minds. But the inherent contradiction of false knowledge is that only those on the outside can tell that it’s false. It’s hard for facts to fight it because to the person who holds it, it feels like truth.

At first glance, it’s hard to see why evolution would have let humans stay resistant to facts. “You don’t want to be a denialist and say, ‘Oh, that’s not a tiger, why should I believe that’s a tiger?’ because you could get eaten,” says McIntyre, a research fellow at the Center for Philosophy and History of Science at Boston University.

But from an evolutionary perspective, there are more important things than truth. Take the same scenario McIntyre mentioned and flip it on its head—you hear a growl in the bushes that sounds remarkably tiger-like. The safest thing to do is probably high-tail it out of there, even if it turns out it was just your buddy messing with you. Survival is more important than truth.

And of course, truth gets more complicated when it’s a matter of more than just “Am I about to be eaten or not?” As Pascal Boyer, an anthropologist and psychologist at Washington University in St. Louis, points out in his forthcoming book The Most Natural Thing: How Evolution Explains Human Societies: “The natural environment of human beings, like the sea for dolphins or the ice for polar bears, is information provided by others, without which they could not forage, hunt, choose mates, or build tools. Without communication, no survival for humans.”

In this environment, people with good information are valued. But expertise comes at a cost—it requires time and work. If you can get people to believe you’re a good source without actually being one, you get the benefits without having to put in the work. Liars prosper, in other words, if people believe them. So some researchers have suggested motivated reasoning may have developed as a “shield against manipulation.” A tendency to stick with what they already believe could help protect people from being taken in by every huckster with a convincing tale who comes along.

“This kind of arms-race between deception and detection is common in nature,” Boyer writes.

Spreading a tall tale also gives people something even more important than false expertise—it lets them know who’s on their side. If you accuse someone of being a witch, or explain why you think the contrails left by airplanes are actually spraying harmful chemicals, the people who take you at your word are clearly people you can trust, and who trust you. The people who dismiss your claims, or even those who just ask how you know, are not people you can count on to automatically side with you no matter what.

“You spread stories because you know that they’re likely to be a kind of litmus test, and the way people react will show whether they’re prepared to side with you or not,” Boyer says. “Having social support, from an evolutionary standpoint, is far more important than knowing the truth about some facts that do not directly impinge on your life.” The meditation and sense of belonging that Daniel Shaw got from Siddha Yoga, for example, was at one time more important to his life than the alleged misdeeds of the gurus who led the group.

Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming, and Shaw held onto his reverence for his guru, because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

Shaw describes the motivated reasoning that happens in these groups: “You’re in a position of defending your choices no matter what information is presented,” he says, “because if you don’t, it means that you lose your membership in this group that’s become so important to you.” Though cults are an intense example, Shaw says people act the same way with regard to their families or other groups that are important to them.

And in modern America, one of the groups that people have most intensely hitched their identities to is their political party. Americans are more politically polarized than they’ve been in decades, possibly ever. There isn’t public-opinion data going back to the Federalists and the Democratic-Republicans, of course. But political scientists Keith Poole and Howard Rosenthal track polarization in Congress, and the most recent data shows that 2015 had the highest rate of polarization since 1879, the earliest year for which there’s data. And that was even before, well, you know.

[Chart: Party Polarization, 1879–2015. Source: Keith T. Poole and Howard Rosenthal, voteview.com]

Now, “party is a stronger part of our identity,” says Brendan Nyhan, a professor of government at Dartmouth College. “So it’s easy to see how we can slide into a sort of cognitive tribalism.”

Though as the graph above shows, partisanship has been on the rise in the United States for decades, Donald Trump’s election, and even his brief time as president, have made partisanship and its relationship to facts seem like one of the most urgent questions of the era. In the past couple of years, fake news stories perfectly crafted to appeal to one party or the other have proliferated on social media, convincing people that the Pope had endorsed Trump or that Rage Against the Machine was reuniting for an anti-Trump album. While some studies suggest that conservatives are more susceptible to fake news—one fake news creator told NPR that stories he’d written targeting liberals never gained as much traction—after the election, the tables seem to have turned. As my colleague Robinson Meyer reported, in recent months there’s been an uptick in progressive fake news, stories that claim Trump is about to be arrested or that his administration is preparing for a coup.

Though both Hillary Clinton and Donald Trump were disliked by members of their own parties—with a “Never Trump” movement blooming within the Republican Party—ultimately most people voted along party lines. Eighty-nine percent of Democrats voted for Clinton and 88 percent of Republicans voted for Trump, according to CNN’s exit polls.

Carol Tavris, a social psychologist and co-author of Mistakes Were Made, But Not by Me, says that for Never Trump Republicans, it must have been “uncomfortable to them to feel they could not be wholeheartedly behind their candidate. You could hear the dissonance humming within them. We had a year of watching with interest as Republicans struggled to resolve this. Some resolved it by: ‘Never Trump but never Hillary, either.’ Others resolved it by saying, ‘I’m going to hold my nose and vote for him because he’s going to do the things that Republicans do in office.’”

“Partisanship has been revealed as the strongest force in U.S. public life—stronger than any norms, independent of any facts,” Vox’s David Roberts wrote in his extensive breakdown of the factors that influenced the election. The many things that, during the campaign, might have seemed to render Trump unelectable—boasting about sexual assault, encouraging violence at his rallies, attacking an American-born judge for his Mexican heritage—did not ultimately cost him the support of the majority of his party. Republican commentators and politicians even decried Trump as not a true conservative. But he was the Republican nominee, and he rallied the Republican base.

In one particularly potent example of party trumping fact, when shown photos of Trump’s inauguration and Barack Obama’s side by side, in which Obama clearly had a bigger crowd, some Trump supporters identified the bigger crowd as Trump’s. When researchers explicitly told subjects which photo was Trump’s and which was Obama’s, a smaller portion of Trump supporters falsely said Trump’s photo had more people in it.

While this may appear to be a remarkable feat of self-deception, Dan Kahan thinks it’s likely something else. It’s not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump. “People knew what was being done here,” says Kahan, a professor of law and psychology at Yale University. “They knew that someone was just trying to show up Trump or trying to denigrate their identity.” The question behind the question was, “Whose team are you on?”

In these charged situations, people often don’t engage with information as information but as a marker of identity. Information becomes tribal.

In a New York Times article called “The Real Story About Fake News Is Partisanship,” Amanda Taub writes that sharing fake news stories on social media that denigrate the candidate you oppose “is a way to show public support for one’s partisan team—roughly the equivalent of painting your face with team colors on game day.”

This sort of information tribalism isn’t a consequence of people lacking intelligence or of an inability to comprehend evidence. Kahan has previously written that whether people “believe” in evolution or not has nothing to do with whether they understand the theory of it—saying you don’t believe in evolution is just another way of saying you’re religious. Similarly, a recent Pew study found that a high level of science knowledge didn’t make Republicans any more likely to say they believed in climate change, though it did for Democrats.

What’s more, being intelligent and informed can often make the problem worse. The higher someone’s IQ, the better they are at coming up with arguments to support a position—but only a position they already agree with, as one study showed. High levels of knowledge make someone more likely to engage in motivated reasoning—perhaps because they have more to draw on when crafting a counterargument.

People also learn selectively—they’re better at learning facts that confirm their worldview than facts that challenge it. And media coverage makes that worse. While more news coverage of a topic seems to generally increase people’s knowledge of it, one paper, “Partisan Perceptual Bias and the Information Environment,” showed that when the coverage has implications for a person’s political party, then selective learning kicks into high gear.

“You can have very high levels of news coverage of a particular fact or an event and you see little or no learning among people who are motivated to disagree with that piece of information,” says Jennifer Jerit, a professor of political science at Stony Brook University and a co-author of the partisan-perception study. “Our results suggest that extraordinary levels of media coverage may be required for partisans to incorporate information that runs contrary to their political views,” the study reads. For example, Democrats are overwhelmingly supportive of bills to ban the chemical BPA from household products, even though the FDA and many scientific studies have found it is safe at the low levels currently used. This reflects a “chemophobia” often seen among liberals, according to Politico.

Fact-checking erroneous statements made by politicians or cranks may also be ineffective. Nyhan’s work has shown that correcting people’s misperceptions often doesn’t work, and worse, sometimes it creates a backfire effect, making people endorse their misperceptions even more strongly.

Sometimes during experimental studies in the lab, Jerit says, researchers have been able to fight against motivated reasoning by priming people to focus on accuracy in whatever task is at hand, but it’s unclear how to translate that to the real world, where people wear information like team jerseys. Especially because a lot of false political beliefs have to do with issues that don’t really affect people’s day-to-day lives.

“Most people have no reason to have a position on climate change aside from expression of their identity,” Kahan says. “Their personal behavior isn’t going to affect the risk that they face. They don't matter enough as a voter to determine the outcome on policies or anything like this. These are just badges of membership in these groups, and that’s how most people process the information.”

In 2016, Oxford Dictionaries chose “post-truth” as its word of the year, defined as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”

It was a year when the winning presidential candidate lied almost constantly on the campaign trail, when fake news abounded, and when people cocooned themselves thoroughly in social-media spheres that only told them what they wanted to hear. After careening through a partisan hall of mirrors, the “facts” that came through were so twisted and warped that Democrats and Republicans alike were accused of living in a “filter bubble,” or an “echo chamber,” or even an “alternate reality.”

Farhad Manjoo’s book, True Enough: Learning to Live in a Post-Fact Society, sounds like it could have come out yesterday—with its argument about how the media is fragmenting, how belief beats out fact, and how objective reality itself gets questioned—but it was actually published in 2008.

“Around the time [the book] came out, I was a little bit unsure how speculative and how real the idea was,” says Manjoo, who is now a technology columnist for The New York Times. “One of my arguments was, in politics, you don’t pay a penalty for lying.” At the time, a lot of lies were going around about presidential candidate Barack Obama—that he was a Muslim, that he wasn’t born in the United States—lies that did not ultimately sink him.

“Here was a person who was super rational, and believed in science, and was the target of these factless claims, but won anyway,” Manjoo says. “It really seemed like that election was a vindication of fact and truth, which in retrospect, I think it was just not.”

There was plenty of post-truth to go around during the Obama administration, whether it was the birther rumors (famously perpetuated by the current president) that just wouldn’t die, or the debate over the nonexistent “death panels” in the Affordable Care Act.

“I started to get a sense that my idea was probably realer than I thought,” Manjoo says. “And then you had the 2016 election, which confirmed every worst fear of mine.”

But the problem, Nyhan says, with “post-truth, post-fact language is it suggests a kind of golden age that never existed in which political debate was based on facts and truth.”

People have always been tribal and have always believed things that aren’t true. Is the present moment really so different, or do the stakes just feel higher?

Partisanship has surely ramped up—but Americans have been partisan before, to the point of civil war. Today’s media environment is certainly unique, though it’s following some classic patterns. This is hardly the first time there have been partisan publications, or many competing outlets, or even information silos. People often despair at the loss of the mid-20th-century model, when just a few newspapers and TV channels fed people most of their unbiased news vegetables. But in the 19th century, papers were known for competing for eyeballs with sensational headlines, and in the time of the Founding Fathers, Federalist and Republican papers were constantly sniping at each other. In times when communication wasn’t as easy as it is now, news was more local—you could say people were in geographical information silos. The mid-20th-century “mainstream media” was an anomaly.

The situation now is in some ways a return to the bad old days of bias and silos and competition, “but it’s like a supercharged return,” Manjoo says. “It’s not just that I’m reading news that confirms my beliefs, but I’m sharing it and friending other people, and that affects their media. I think it’s less important what a news story says than what your friend says about the news story.” These silos are also no longer geographical, but ideological and thus less diverse. A recent study in the Proceedings of the National Academy of Sciences that analyzed 376 million Facebook users’ interactions with 900 news outlets reports that “selective exposure drives news consumption.”

Not everyone, however, agrees that the silos exist. Kahan says he’s not convinced: “I think that people have a preference for the sources that support their position. That doesn’t mean that they're never encountering what the other side is saying.” They’re just dismissing it when they do.

The sheer scale of the internet allows you to find evidence (if sometimes dubious evidence) for any claim you want to believe, and counterevidence against any claim you don’t want to have to believe. And because humans didn’t evolve to operate in such a large sea of people and information, Boyer says people can be fooled into thinking some ideas are more widespread than they really are.

“When I was doing fieldwork in small villages in Africa, I've seen examples of people who have a strange belief,” he says. “[For example], they think that if they recite an incantation they can make a small object disappear. Now, most people around them just laugh and tell them that’s stupid. And that’s it. And the belief kind of disappears.”

But as a community gets larger, the likelier it is that a person can find someone else who shares their strange belief. And if the “community” is everyone in the world with an internet connection who speaks your language, well.

“If you encounter 10 people who seem to have roughly the same idea, then it fools your system into thinking that it must be a probable idea because lots of people agree with it,” Boyer says. “One thing you assume, unconsciously, is that these 10 people came to the same belief independently. You don’t think that nine of these are just repeating something that the 10th one said.”

Part of the problem is that society has advanced to the point that believing what’s true often means accepting things you don’t have any firsthand experience of and that you may not completely understand. Sometimes it means disbelieving your own senses—Earth doesn’t feel like it’s moving, after all, and you can’t see climate change out your window.

In areas where you lack expertise, you have to rely on trust. Even Clifford acknowledges this—it’s acceptable, he says, to believe what someone else tells you “when there is reasonable ground for supposing that he knows the matter of which he speaks.”

The problem is that who and what people trust to give them reliable information is also tribal. Deferring to experts might seem like a good start, but Kahan has found that people see experts who agree with them as more legitimate than experts who don’t.

In the United States, people are less generally trusting of each other than they used to be. Since 1972, the General Social Survey has asked respondents: “Generally speaking, would you say that most people can be trusted or that you can’t be too careful in dealing with people?” As of 2014, the most recent data, the number of people saying most others can be trusted was at a historic low.

[Chart: Percent of Americans Who Say Most People Can Be Trusted, General Social Survey, 1972–2014]

On the other hand, there’s “particularized trust”—specifically, the trust you have for people in your groups. “Particularized trust destroys generalized trust,” Manjoo wrote in his book. “The more that people trust those who are like themselves—the more they trust people in their own town, say—the more they distrust strangers.”

This fuels tribalism. “Particularized trusters are likely to join groups composed of people like themselves—and to shy away from activities that involve people they don’t see as part of their moral community,” writes Eric Uslaner, a professor of government and politics at the University of Maryland, College Park.

So people high on the particularized-trust scale would be more likely to believe information that comes from others in their groups, and if those groups are ideological, the people sharing that information probably already agree with them. And so it spirals.

This is also a big part of why people don’t trust the media. Not that news articles are never biased, but a hypothetical, perfectly evenhanded piece of journalism that fairly and neutrally represented all sides would still likely be seen as biased by people on each side. Because, Manjoo writes, everyone thinks their side has the best evidence, and therefore if the article were truly objective, it would have emphasized their side more.

This is the attitude Trump has taken toward the media, calling any unfavorable coverage of him—even if it’s true—“unfair” and “fake news.” On the other hand, outlets that are biased in his favor, like Fox and Friends and the pro-Trump conservative blog The Gateway Pundit, Trump bills as “very honorable” and he invites them to the White House. (This is a reversal of fortune for Fox, which got a similar “fake news” style brush-off in 2009, when Obama’s communications director said the administration wouldn’t “legitimize them as a news organization.”) Trump’s is an extreme, id-fueled version of particularized trust, to be sure, but it’s akin to a mind-set many are prone to. Objectivity is a valiant battle, but sometimes, a losing one.

“Alternative facts” is a phrase that will live in infamy. Trump counselor Kellyanne Conway famously used it to describe White House Press Secretary Sean Spicer’s lie that Trump’s inauguration had drawn the “largest audience to ever witness an inauguration—period.”

Spicer has also said to reporters, “I think sometimes we can disagree with the facts.”

These are some of the more explicit statements from an administration that shows in ways subtle and not-at-all subtle that it often does not, as McIntyre would put it, “respect the truth.” This sort of flippant disregard for objective reality is deeply troubling, but the extreme nature of it also exposes more clearly something that’s always been true about politics: that sometimes when we argue about the facts, we’re not arguing about the facts at all.

The experiment where Trump supporters were asked about the inauguration photos is one example. In a paper on political misperceptions, Nyhan suggests another: a survey asking people whether they agree with the statement “The murder rate in the United States is the highest it’s been in 45 years,” a claim Trump often made on the campaign trail, and one that is not true. “Because the claim is false,” Nyhan writes, “the most accurate response is to disagree. But what does it mean if a person agrees with the statement?”

It becomes unclear whether the person really believes the false statement, or whether they’re using it as a shortcut to express something else: their support for Trump regardless of the validity of his claims, or simply the fact that they feel unsafe and are worried about crime. For the media outlets fact-checking these claims, it’s a matter of truth and falsehood; for the ordinary person evaluating, adopting, rejecting, or spreading false beliefs, that may not be what it’s really about.

These are more often disputes over values, Kahan says, about what kind of society people want and which group or politician aligns with that. “Even if a fact is corrected, why is that going to make a difference?” he asks. “That’s not why they were supporting the person in the first place.”

So what would get someone to change their mind about a false belief that is deeply tied to their identity?

“Probably nothing,” Tavris says. “I mean that seriously.”

But of course there are areas where facts can make a difference. There are people who are just mistaken or who are motivated to believe something false without treasuring the false belief like a crown jewel.

“Personally my own theory is that there’s a slide that happens,” McIntyre says. “This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet. I think once they’ve hit denial, they’re too far gone and there’s not a lot you can do to save them.”

There are small things that could help. One recent study suggests that people can be “inoculated” against misinformation. In that study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” Exposing people to the fact that this misinformation is out there should make them more resistant to it if they encounter it later. And in the study, at least, it worked.

While there’s no erasing humans’ tribal tendencies, muddying the waters of partisanship could make people more open to changing their minds. “We know people are less biased if they see that policies are supported by a mix of people from each party,” Jerit says. “It doesn’t seem like that’s very likely to happen in this contemporary period, but even to the extent that they see within-party disagreement, I think that is meaningful. Anything that’s breaking this pattern where you see these two parties acting as homogeneous blocs, there’s evidence that motivated reasoning decreases in these contexts.”

It’s also possible to at least imagine a media environment that’s less hospitable to fake news and selective exposure than our current one, which relies so heavily on people’s social-media networks.

I asked Manjoo what a less fake-newsy media environment might look like.

“I think we need to get to an information environment where sharing is slowed down,” Manjoo says. “A really good example of this is Snapchat. Everything disappears after a day—you can’t have some lingering thing that gets bigger and bigger.”

Facebook is apparently interested in copying some of Snapchat’s features—including the disappearing messages. “I think that would reduce virality, and then you could imagine that would perhaps cut down on sharing false information,” Manjoo says. But he adds a caveat: “Things must be particularly bad if you’re looking at Snapchat for reasons of hope.”

So much of how people view the world has nothing to do with facts. That doesn’t mean truth is doomed, or even that people can’t change their minds. But what all this does seem to suggest is that, no matter how strong the evidence is, there’s little chance of it changing someone’s mind if they really don’t want to believe what it says. They have to change their own.

As previously noted, Daniel Shaw ultimately left Siddha Yoga. But it took a long time. “Before that [New Yorker] article came out,” he says, “I started to learn about what was going to be in that article, and the minute I heard it is the minute I left that group, because immediately it all clicked together. But it had taken at least five years of this growing unease and doubt, which I didn’t want to know about or face.”

It seems like if people are going to be open-minded, it’s more likely to happen in group interactions. As Manjoo noted in his book, when the U.S. government was trying to get people to eat organ meat during World War II (you know, to save the good stuff for our boys), researchers found that when housewives had a group discussion about it, rather than just listening to a nutritionist blather on about what a good idea it was, they were five times more likely to actually cook up some organs. And groups are usually better at coming up with the correct answers to reasoning tasks than individuals are.

Of course, the wisdom of groups is probably diminished if everyone in a group already agrees with each other.

“One real advantage of group reasoning is that you get critical feedback,” McIntyre says. “If you’re in a silo, you don’t get critical feedback, you just get applause.”

But if the changes are going to happen at all, it’ll have to be “on a person-to-person level,” Shaw says.

He tells me about a patient of his, whose family is involved in “an extremely fundamentalist Christian group. [The patient] has come to see a lot of problems with the ideology and maintains a relationship with his family in which he tries to discuss in a loving and compassionate way some of these issues,” Shaw says. “He is patient and persistent, and he chips away, and he may succeed eventually.”

“But are they going to listen to a [news] feature about why they’re wrong? I don’t think so.”

When someone does change their mind, it will probably be more like the slow creep of Shaw’s disillusionment with his guru. He left “the way most people do: Sort of like death by a thousand cuts.”

https://www.theatlantic.com/science/archive/2017/03/this-article-wont-change-your-mind/519093/

Mar 16, 2014

Response to SYDA's letter denouncing the Salon.com article about Eat Pray Love and Siddha Yoga

Tuesday, August 17, 2010

Deniers

I personally knew many of the people who have commented on this letter, and who were and are trustees. I personally know that they know the facts about Siddha Yoga. I know that they know the following:

  • that Swami Muktananda was a sexual predator who molested scores of women, including minor girls, lied about it, and threatened those who told the truth with violence;
  • that Gurumayi, Muktananda's successor after his death, encouraged and enjoyed a campaign of harassment and violence against her brother when she wanted to remove him from power within the organization;
  • that Gurumayi routinely lied to her followers, had her followers spied upon, and publicly humiliated followers by revealing information they had shared with her privately;
  • that Gurumayi blamed young women in the ashram for what happened to them when they were seduced and molested by male leaders there, and protected and defended the molesters.

These are just a few of the more concrete abuses that can be cited. More difficult to describe is the environment of intimidation, the belligerence, the control over every aspect of the followers' lives, and the exploitation of workers who are expected to work endless hours without pay or benefits, yet are made to feel guilty and ashamed for never giving enough.

Additionally, what does the premise that a human being, in this case Gurumayi, is a self-proclaimed "fully realized master" actually mean? That everyone who is not a "realized master" is inferior to her? Such a premise is simply a means by which Gurumayi profits through the subjugation of others, who have come to believe that they too can have a little piece of that superior status. All you have to do is follow the leader, no matter where she leads.

Sure, VIPs like Elizabeth Gilbert, known in the ashram as SCs (Special Consideration guests), wouldn't see any of this or have a clue. They have no exposure to the hidden world of the inner circles around Gurumayi. The rich and famous get the sanitized Siddha Yoga. Folks without money who are willing to devote their lives to what they think of as a true religion get a very different experience.

I know that these trustees and many of these followers personally know of all these abuses, and of so much more deception, corruption, and abuse, from the earliest days of Swami Muktananda to the present. That they continue to choose to deny these facts is tragic, but also despicable. SYDA sells spiritual enlightenment through devotion to the guru at a very steep price: your integrity, your moral values, and your independent and critical thinking. And after you give all those things up, and delude yourself into thinking you are still aspiring toward enlightenment, your only choices are either to tell the truth and leave Siddha Yoga, or to stay and become a denier and defender of abuse, exploitation, and corruption. The facts that SYDA and its apologists want to deny are readily available in media articles and personal testimonies. See the website that I maintain at www.leavingsiddhayoga.net.

Since I left Siddha Yoga, I have received an endless stream of hate mail from the deniers; I'm sure more of it will follow this post. That's what Siddha Yoga spirituality is about: vilify critics, deny abuse, and keep filling those Swiss bank accounts. Caveat emptor!

Daniel Shaw
Nyack, NY