Jul 17, 2023
Social Influence
Dec 15, 2021
CultNEWS101 Articles: 12/15/2021 (Conspiracy Theories, MDMA, Rajneesh, Influence)
The Conversation: How conspiracy theories in the US became more personal, more cruel and more mainstream after the Sandy Hook shootings
"Social media's role in spreading misinformation has been well documented in recent years. The year of the Sandy Hook shooting, 2012, marked the first year that more than half of all American adults used social media.
It also marked a modern low in public trust of the media. Gallup's annual survey has since shown even lower levels of trust in the media in 2016 and 2021.
These two coinciding trends – which continue to drive misinformation – pushed fringe doubts about Sandy Hook quickly into the U.S. mainstream. Speculation that the shooting was a false flag – an attack made to look as if it were committed by someone else – began to circulate on Twitter and other social media sites almost immediately. Far-right commentator and conspiracy theorist Alex Jones and other fringe voices amplified these false claims.
Jones was recently found liable by default in defamation cases filed by Sandy Hook families.
Mistakes in breaking news reports about the shooting, such as conflicting information on the gun used and the identity of the shooter, were spliced together in YouTube videos and compiled on blogs as proof of a conspiracy, as my research shows. Amateur sleuths collaborated in Facebook groups that promoted the shooting as a hoax and lured new users down the rabbit hole.
Soon, a variety of establishment figures, including the 2010 Republican nominee for Connecticut attorney general, Martha Dean, gave credence to doubts about the tragedy.
Six months later, as gun control legislation stalled in Congress, a university poll found 1 in 4 people thought the truth about Sandy Hook was being hidden to advance a political agenda. Many others said they weren't sure. The results were so unbelievable that some media outlets questioned the poll's accuracy.
Today, other conspiracy theories have followed a similar trajectory on social media. The media is awash with stories about the popularity of the bizarre QAnon conspiracy movement, which falsely claims top Democrats are part of a Satan-worshiping pedophile ring. A member of Congress, U.S. Rep. Marjorie Taylor Greene, has also publicly denied Sandy Hook and other mass shootings."
But back in 2012, the spread of outlandish conspiracy theories from social media into the mainstream was a relatively new phenomenon, and an indication of what was to come.
Vice: Did the Cult From 'Wild Wild Country' Introduce MDMA to Ibiza?
"The free-loving sannyasins from the Bhagwan movement were a "crucial bridge between Ibiza's 60s counterculture and the 90s electronic dance".
"'I would only believe in a God who knew how to dance.'
So opined the famously deity-suspicious philosopher Friedrich Nietzsche in Thus Spoke Zarathustra. This remained an oft-used saying of Bhagwan Shree Rajneesh – the pseudonymous leader of the free-loving Bhagwan movement, the subject of Netflix's 2018 docuseries Wild Wild Country.
Nietzsche's quote twins nicely with a longstanding rumor that floats around the edges of drug culture: that Bhagwan's disciples (also called Rajneeshees, sannyasins or simply Bhagwans) actually introduced MDMA to Ibiza in the mid-80s. From here, the drug supposedly coalesced with the island's new Balearic sounds, played most famously by DJ Alfredo at the nightclub Amnesia, to sow seeds of many contemporary cultures – from electronic music and festivals to "the sesh" itself. But does the story stand up to scrutiny and did the Bhagwans help the world to dance and get high? I wanted to find out.
But before that, some history: In the 30s and 40s, Ibiza became a nexus of artists, musicians and beatniks escaping the vagaries of European fascism. Californian Vietnam War draft-dodging hippies were added to the melting pot, and the island became a common stop on the hippie trail. From the mid-70s and into the 80s, the island's horny freaks and trust fund babies nurtured an embryonic club scene, with legendary venues like Amnesia, Pacha and KU serving a pleasure-seeking crowd.
MDMA, meanwhile, had evolved from the still-legal preserve of progressive 70s Californian psychotherapists to the gay nightlife scene in New York, Chicago, and Dallas – in the latter, it was sold over the counter at the Starck nightclub. It was finally banned by the DEA in 1985, but not before the preeminent producers of ecstasy in America, known as the Texas Group, had reportedly churned out two million tablets in the weeks preceding the shutdown."
Inc: Want to Be More Influential, Persuasive, and Charismatic? Science Says First Take a Look at the Clock
New research shows how to leverage your circadian rhythm to increase your charisma and be more inspiring.
" ... Some people are extremely persuasive. They influence (in a good way) the people around them. They make people feel a part of something bigger than themselves.
In fact, every successful person I know is at least somewhat charismatic. And tends to be really good at persuading other people; not by manipulating or pressuring, but by describing the logic and benefits of an idea to gain agreement."
News, Education, Intervention, Recovery
Intervention101.com to help families and friends understand and effectively respond to the complexity of a loved one's cult involvement.
CultRecovery101.com assists group members and their families in making the sometimes difficult transition from coercion to renewed individual choice.
CultNEWS101.com news, links, resources.
Cults101.org resources about cults, cultic groups, abusive relationships, movements, religions, political organizations and related topics.
Selection of articles for CultNEWS101 does not mean that Patrick Ryan or Joseph Kelly agree with the content. We provide information from many points of view in order to promote dialogue.
Please forward articles that you think we should add to cultintervention@gmail.com.
Oct 1, 2019
Certainty
Openingminds
October 1, 2019
Jon talks about certainty, and questions why we're certain about anything, bringing up a few historical examples to illustrate his point.
Oct 6, 2018
Cult Mediation: Influence
"6 fundamental social and psychological principles underlying the individual tactics that successful persuaders or compliance practitioners use every day to get us to say yes."
http://cultmediation.com/articles/influence/
Oct 1, 2017
Want to Make a Lie Seem True? Say It Again, and Again, and Again
None of those things are true.
But the facts don't actually matter: People repeat them so often that you believe them. Welcome to the “illusory truth effect,” a glitch in the human psyche that equates repetition with truth. Marketers and politicians are masters of manipulating this particular cognitive bias—which perhaps you have become more familiar with lately.
President Trump is a "great businessman," he says over and over again. Some evidence suggests that might not be true. Or look at just this week, when the president signed three executive orders designed to stop what he describes—over and over again—as high levels of violence against law enforcement in America. Sounds important, right? But such crimes are at their lowest rates in decades, as are most violent crimes in the US. Not exactly, as the president would have it, "American carnage."
"President Trump intends to build task forces to investigate and stop national trends that don’t exist," says Jeffery Robinson, deputy legal director of the American Civil Liberties Union. He's right that the trends aren't real, of course. But some number of people still believe it. Every time the president tweets or says something untrue, fact-checkers race to point out the falsehood—to little effect. A Pew Research poll last fall found 57 percent of presidential election voters believed crime across the US had gotten worse since 2008, despite FBI data showing it had fallen by about 20 percent.
So what's going on here? "Repetition makes things seem more plausible," says Lynn Hasher, a psychologist at the University of Toronto whose research team first noticed the effect in the 1970s. "And the effect is likely more powerful when people are tired or distracted by other information." So ... 2017, basically.
Brain Feels
Remember those "Head On! Apply Directly to the Forehead!" commercials? That's the illusory truth effect in action. The ads repeated the phrase so much so that people found themselves at the drugstore staring at a glue-stick-like contraption thinking, "Apply directly to MY forehead!" The question of whether it actually alleviates pain gets smothered by a combination of tagline bludgeoning and tension headache.
Repetition is what makes fake news work, too, as researchers at Central Washington University pointed out in a study way back in 2012 before the term was everywhere. It's also a staple of political propaganda. It's why flacks feed politicians and CEOs sound bites that they can say over and over again. Not to go all Godwin's Law on you, but even Adolf Hitler knew about the technique. "Slogans should be persistently repeated until the very last individual has come to grasp the idea," he wrote in Mein Kampf.
The effect works because when people attempt to assess truth they rely on two things: whether the information jibes with their understanding, and whether it feels familiar. The first condition is logical: People compare new information with what they already know to be true and consider the credibility of both sources. But researchers have found that familiarity can trump rationality—so much so that hearing over and over again that a certain fact is wrong can have a paradoxical effect. It's so familiar that it starts to feel right.
"When you see the fact for the second time it's much easer to process—you read it more quickly, you understand it more fluently," says Vanderbilt University psychologist Lisa Fazio. "Our brain interprets that fluency as a signal for something being true"—Whether it's true or not. In other words, rationality can be hard. It takes work. Your busy brain is often more comfortable running on feeling.
You are busy, too, so let me get back to Trump's latest executive orders, which are mostly symbolic. They certify that the government will do what it can to keep law enforcement officers safe. They contain vague language that civil rights advocates worry could lead to the criminalization of protest. But while perhaps unnecessary, the orders are hardly pointless—they reinforce the idea that America is unsafe, that law enforcement officers are at risk, that the country needs a strong "law and order" president. Data be damned.
As with any cognitive bias, the best way not to fall prey to it is to know it exists. If you read something that just feels right, but you don't know why, take notice. Look into it. Check the data. If that sounds like too much work, well, facts are fun.
https://www.wired.com/2017/02/dont-believe-lies-just-people-repeat/
Mar 14, 2017
Nine in 10 people would electrocute others if ordered, rerun of infamous Milgram Experiment shows
The Milgram Experiment being conducted in the 1960s
March 14, 2017
A notorious experiment in the 1960s to find out if ordinary people were prepared to inflict pain if ordered to do so by an authority figure has reached an even more sinister conclusion.
Despite the lessons of history, nine in 10 would electrocute their peers even if they were screaming in agony, simply because they were told to do so.
When the original study was conducted by American psychologist Stanley Milgram, from Yale University, only two thirds of people continued all the way up to the maximum 450-volt level.
The experiments were devised to investigate the insistence by the German Nazi Adolf Eichmann, during his war crimes trial, that he and his accomplices in the Holocaust were "just following orders".
Fifty years later, the new version of the experiment, conducted in Poland, has shown that human nature, if anything, has got worse.
Most people say they would not inflict pain on others but are happy to do so if ordered to by an authority figure
This time, 80 participants were recruited, including women as well as men, and 90 per cent were willing to inflict the highest shock level of 450 volts to a complicit "learner" screaming in agony.
Social psychologist Dr Tomasz Grzyb, from the SWPS University of Social Sciences and Humanities in Poland, said: "Upon learning about Milgram's experiments, a vast majority of people claim that 'I would never behave in such a manner'.
"Our study has, yet again, illustrated the tremendous power of the situation the subjects are confronted with and how easily they can agree to things which they find unpleasant."
The participants, aged 18 to 69, were shown an electric generator which was demonstrated by administering a mild shock of 45 volts.
Volunteers were given a series of 10 levers to press, each appearing to send a successively higher shock to the learner - out of sight in a neighbouring room - via electrodes attached to the wrist.
In reality, no electric shocks were delivered, and, as in the original experiment, the learner was playing a role.
After pressing lever number two, "successive impulses of electricity resulted in screams of increasing pain from the learner," the scientists wrote in the journal Social Psychological and Personality Science.
"These screams were recorded and played back at appropriate moments."
The "teachers" were told they were taking part in research on memory and learning.
Just as in Milgram's experiment, they were spurred on by prompts from the supervising scientist such as "the experiment requires that you continue", "it is absolutely essential that you continue", and "you have no other choice, you must go on".
Mercy was more apparent when the learner was a woman. In this case, the number of participants refusing to carry out the orders of the experimenter was three times higher than when the person receiving the "shocks" was a man.
Dr Grzyb concluded: "Half a century after Milgram's original research into obedience to authority, a striking majority of subjects are still willing to electrocute a helpless individual."
A recent study by the University of St Andrews suggested that people were happy to inflict pain on others if they believed it was for the greater good. The researchers looked back through records of the original experiment and found that those who took part were not unhappy with their choice.
http://www.telegraph.co.uk/science/2017/03/14/nine-10-people-would-electrocute-others-ordered-re-run-milgram/
Mar 26, 2016
Fueling Terror: How Extremists Are Made
The psychology of group dynamics goes a long way toward explaining what drives ordinary people toward radicalism
S. Alexander Haslam and Stephen D. Reicher
Scientific American
March 25, 2016
Understanding Co-radicalization
Although we may think of terrorists as sadists and psychopaths, social psychology suggests they are mostly ordinary people, driven by group dynamics to do harm for a cause they believe to be noble and just. Terrorism reconfigures these group dynamics so that extreme leadership seems more appealing to everyone. Just as ISIS feeds off immoderate politicians in the West, for example, so do those immoderate politicians feed off ISIS to draw support for themselves. Having others misperceive or deny a valued identity—an experience we describe as misrecognition—systematically provokes anger and cynicism toward authorities.
The steep and virulent rise of terrorism ranks among the more disturbing trends in the world today. According to the 2015 Global Terrorism Index, terror-related deaths have increased nearly 10-fold since the start of the 21st century, surging from 3,329 in 2000 to 32,685 in 2014. Between 2013 and 2014 alone, they shot up 80 percent. For social psychologists, this escalation prompts a series of urgent questions, just as it does for society as a whole: How can extremist groups treat fellow human beings with such cruelty? Why do their barbaric brands of violence appeal to young people around the globe? Who are their recruits, and what are they thinking when they target innocent lives?
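As a quick arithmetic sanity check on the Global Terrorism Index figures quoted above, here is a minimal Python sketch; only the 2000 and 2014 totals appear in this excerpt, so the separate 80 percent year-on-year claim (which would require the 2013 total) is not recomputed here.

```python
# Minimal check of the 2015 Global Terrorism Index figures cited above.
# Only the 2000 and 2014 totals appear in the excerpt, so the "nearly
# 10-fold" claim is the only one that can be recomputed here.
deaths_2000 = 3_329
deaths_2014 = 32_685

fold_increase = deaths_2014 / deaths_2000
print(f"2000 -> 2014 increase: {fold_increase:.1f}x")  # ~9.8x, i.e. "nearly 10-fold"
```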
Many people jump to the conclusion that only psychopaths or sadists—individuals entirely different from us—could ever strap on a suicide vest or wield an executioner's sword. But sadly that assumption is flawed. Thanks to classic studies from the 1960s and 1970s, we know that even stable, well-adjusted individuals are capable of inflicting serious harm on human beings with whom they have no grievance whatsoever. Stanley Milgram's oft-cited “obedience to authority” research showed that study volunteers were willing to administer what they believed to be lethal electric shocks to others when asked to do so by a researcher in a lab coat. Fellow psychologist Philip Zimbardo's (in)famous Stanford Prison Experiment revealed that college students assigned to play the part of prison guards would humiliate and abuse other students who were prisoners.
These studies proved that virtually anyone, under the right—or rather the wrong—circumstances, could be led to perpetrate acts of extreme violence. And so it is for terrorists. From a psychological perspective, the majority of adherents to radical groups are not monsters—much as we would like to believe that—no more so than were the everyday Americans participating in Milgram's and Zimbardo's investigations. As anthropologist Scott Atran notes, drawing on his long experience of studying these killers, most are ordinary people. What turns someone into a fanatic, Atran explained in his 2010 book Talking to the Enemy, “is not some inherent personality defect but the person-changing dynamic of the group” to which he or she belongs.
For Milgram and Zimbardo, these group dynamics had to do with conformity—obeying a leader or subscribing to the majority view. During the past half a century, though, our understanding of how people behave both within and among groups has advanced. Recent findings challenge the notion that individuals become zombies in groups or that they can be easily brainwashed by charismatic zealots. These new insights are offering a fresh take on the psychology of would-be terrorists and the experiences that can prime them toward radicalization.
In particular, we are learning that radicalization does not happen in a vacuum but is driven in part by rifts among groups that extremists seek to create, exploit and exacerbate. If you can provoke enough non-Muslims to treat all Muslims with fear and hostility, then those Muslims who previously shunned conflict may begin to feel marginalized and heed the call of the more radical voices among them. Likewise, if you can provoke enough Muslims to treat all Westerners with hostility, then the majority in the West might also start to endorse more confrontational leadership. Although we often think of Islamic extremists and Islamophobes as being diametrically opposed, the two are inextricably intertwined. And this realization means that solutions to the scourge of terror will lie as much with “us” as with “them.”
FOLLOWING THE LEADER
Milgram's and Zimbardo's findings showed that almost anyone could become abusive. If you look closely at their results, though, most participants did not. So what distinguished those who did? The pioneering work of social psychologists Henri Tajfel and John Turner in the 1980s, though unrelated, suggested part of the answer. They argued that a group's behavior and the ultimate influence of its leaders depended critically on two interrelated factors: identification and disidentification. Specifically, for someone to follow a group—possibly to the point of violence—he or she must identify with its members and, at the same time, detach from people outside the group, ceasing to see them as his or her concern.
We confirmed these dynamics in our own work that has revisited Zimbardo's and Milgram's paradigms. Across a number of different studies, we have found consistently that, just as Tajfel and Turner proposed, participants are willing to act in oppressive ways only to the extent that they come to identify with the cause they are being asked to advance—and to disidentify with those they are harming. The more worthwhile they believe the cause to be, the more they justify their acts as regrettable but necessary.
This understanding—that social identity and not pressure to conform governs how far someone will go—resonates with findings about what actually motivates terrorists. In his 2004 book Understanding Terror Networks, forensic psychiatrist Marc Sageman, a former CIA case officer, emphasized that terrorists are generally true believers who know exactly what they are doing. “The mujahedin were enthusiastic killers,” he noted, “not robots simply responding to social pressures or group dynamics.” Sageman did not dismiss the importance of compelling leaders—such as Osama bin Laden and ISIS's Abu Bakr al-Baghdadi—but he suggested that they serve more to provide inspiration than to direct operations, issue commands or pull strings.
Indeed, there is little evidence that masterminds orchestrate acts of terror, notwithstanding the language the media often use when reporting these events. Which brings us to a second recent shift in our thinking about group dynamics: we have observed that when people do come under the influence of authorities, malevolent or otherwise, they do not usually display slavish obedience but instead find unique, individual ways to further the group's agenda. After the Stanford Prison Experiment had concluded, for example, one of the most zealous guards asked one of the prisoners whom he had abused what he would have done in his position. The prisoner replied: “I don't believe I would have been as inventive as you. I don't believe I would have applied as much imagination to what I was doing.... I don't think it would have been such a masterpiece.” Individual terrorists, too, tend to be both autonomous and creative, and the lack of a hierarchical command structure is part of what makes terrorism so hard to counter.
How do terror leaders attract such engaged, innovative followers if they are not giving direct orders? Other discoveries from the past few decades (summarized in our 2011 book, co-authored with Michael J. Platow, The New Psychology of Leadership) highlight the role leaders play in building a sense of shared identity and purpose for a group, helping members to frame their experiences. They empower their followers by establishing a common cause and empower themselves by shaping it. Indeed, Milgram's and Zimbardo's experiments are object lessons in how to create a shared identity and then use it to mobilize people toward destructive ends. Just as they convinced the participants in their studies to inflict harm in the name of scientific progress, so successful leaders need to sell the enterprise they envision for their group as honorable and noble.
Both al Qaeda and ISIS deploy this strategy. A large part of their appeal to sympathizers is that they promote terror for the sake of a better society—one that harks back to the peaceful community that surrounded the prophet Mohammed. Last year University of Arizona journalism professor Shahira Fahmy carried out a systematic analysis of ISIS's propaganda and found that only about 5 percent depicts the kind of brutal violence typically seen on Western screens. The great majority features visions of an “idealistic caliphate,” which would unify all Muslims harmoniously. Moreover, a significant element of ISIS's success—one that makes it more threatening than al Qaeda—lies in the very fact that its leaders lay claim to statehood. In the minds of its acolytes at least, it has the means to try to make this utopian caliphate a reality.
Crucially, however, the credibility and influence of leaders—especially those who promote conflict and violence—depend not only on what they say and do but also on their opponents' behavior. Evidence for this fact emerged after a series of experiments by one of us (Haslam) and Ilka Gleibs of the London School of Economics that looked at how people choose leaders. One of the core findings was that people are more likely to support a bellicose leader if their group faces competition with another group that is behaving belligerently. Republican candidate Donald Trump might have been wise to ponder this before he suggested that all Muslim immigrants are potential enemies who should be barred from entering the U.S. Far from weakening the radicals, such statements provide the grit that gives their cause greater traction. Indeed, after Trump made his declaration, an al Qaeda affiliate reaired it as part of its propaganda offensive.
THE GRAY ZONE
Just as ISIS feeds off immoderate politicians in the West, so those immoderate politicians feed off ISIS to draw support for themselves. This exchange is part of what religion scholar Douglas Pratt of the University of Waikato in New Zealand refers to as co-radicalization. And here lies the real power in terrorism: it can be used to provoke other groups to treat one's own group as dangerous—which helps to consolidate followers around those very leaders who preach greater enmity. Terrorism is not so much about spreading fear as it is about seeding retaliation and further conflict. Senior research fellow Shiraz Maher of the International Center for the Study of Radicalization and Political Violence at King's College London has pointed out how ISIS actively seeks to incite Western countries to react in ways that make it harder for Muslims to feel that they belong in those communities.
In February 2015 the ISIS-run magazine Dabiq carried an editorial entitled “The Extinction of the Grayzone.” Its writers bemoaned the fact that many Muslims did not see the West as their enemy and that many refugees fleeing Syria and Afghanistan actually viewed Western countries as lands of opportunity. They called for an end of the “gray zone” of constructive coexistence and the creation of a world starkly divided between Muslim and non-Muslim, in which everyone either stands with ISIS or with the kuffar (nonbelievers). It also explained the attacks on the headquarters of the French magazine Charlie Hebdo in exactly these terms: “The time had come for another event—magnified by the presence of the Caliphate on the global stage—to further bring division to the world.”
In short, terrorism is all about polarization. It is about reconfiguring intergroup relationships so that extreme leadership appears to offer the most sensible way of engaging with an extreme world. From this vantage, terrorism is the very opposite of mindless destruction. It is a conscious—and effective—strategy for drawing followers into the ambit of confrontational leaders. Thus, when it comes to understanding why radical leaders continue to sponsor terrorism, we need to scrutinize both their actions and our reactions. As editor David Rothkopf wrote in Foreign Policy after the Paris massacres last November, “overreaction is precisely the wrong response to terrorism. And it's exactly what terrorists want.... It does the work of the terrorists for the terrorists.”
Currently counterterrorism efforts in many countries give little consideration to how our responses may be upping the ante. These initiatives focus only on individuals and presume that radicalization starts when something happens to undermine someone's sense of self and purpose: discrimination, the loss of a parent, bullying, moving, or anything that leaves the person confused, uncertain or alone. Psychologist Erik Erikson noted that youths—still in the process of forming a secure identity—are particularly vulnerable to this kind of derailment [see “Escaping Radicalism,” by Dounia Bouzar, on page 40]. In this state, they become easy prey for radical groups, who claim to offer a supportive community in pursuit of a noble goal.
We have no doubt that this is an important part of the process by which people are drawn into terrorist groups. Plenty of evidence points to the importance of small group ties, and, according to Atran and Sageman, Muslim terrorists are characteristically centered on clusters of close friends and kin. But these loyalties alone cannot adequately address what Sageman himself refers to as “the problem of specificity.” Many groups provide the bonds of fellowship around a shared cause: sporting groups, cultural groups, environmental groups. Even among religious factions—including Muslim groups—the great majority provide community and meaning without promoting violence. So why, specifically, are some people drawn to the few Muslim groups that do preach violent confrontation?
We argue that these groups are offering much more than consolation and support. They also supply narratives that resonate with their recruits and help them make sense of their experiences. And in that case, we need to seriously examine the ideas militant Muslim groups propagate—including the notion that the West is a long-standing enemy that hates all Muslims. Do our “majority” group reactions somehow lend credence to radicalizing voices in the minority Muslim community? Do police, teachers and other prominent figures make young Muslims in the West feel excluded and rejected—such that they come to see the state less as their protector and more as their adversary? If so, how does this change their behavior?
To begin to find out, one of us (Reicher), working with psychologists Leda Blackwood, now at the University of Bath in England, and Nicholas Hopkins of the University of Dundee in Scotland, conducted a series of individual and group interviews at Scottish airports in 2013. As national borders, airports send out clear signals about belonging and identity. We found that most Scots—Muslim and non-Muslim alike—had a clear sense of “coming home” after their travels abroad. Yet many Muslim Scots had the experience of being treated with suspicion at airport security. Why was I pulled aside? Why was I asked all those questions? Why was my bag searched? In the words of one 28-year-old youth worker: “For me to be singled out felt [like], ‘Where am I now?’ I consider Scotland my home. Why am I being stopped in my own house? Why am I being made to feel as the other in my own house?”
We gave the term “misrecognition” to this experience of having others misperceive or deny a valued identity. It systematically provoked anger and cynicism toward authorities. It led these individuals to distance themselves from outwardly British-looking people. After such an experience, one Muslim Scot said he felt that he would look ridiculous if he then continued to advocate trust in the agencies that had humiliated him. In other words, misrecognition can silence those who, having previously felt aligned with the West, might have been best placed to prevent further polarization. To be clear, misrecognition did not instantly turn otherwise moderate people into terrorists or even extremists. Nevertheless, it began to shift the balance of power away from leaders who say, “Work with the authorities; they are your friends,” toward those who might insist, “The authorities are your enemy.”
A CAUTIONARY TALE
We can take this analysis of misrecognition and its consequences a step further. When we adapted Zimbardo's prison study in our own research, we wanted to reexamine what happens when you mix two groups with unequal power. For one thing, we wanted to test some of the more recent theories about how social identity affects group dynamics. For instance, we reasoned that prisoners would identify with their group only if they had no prospect of leaving it. So we first told the volunteers assigned to be prisoners that they might be promoted to be guards if they showed the right qualities. Then, after a single round of promotions, we told them that there would be no more changes. They were stuck where they were.
We have discussed the effects of these manipulations in many publications, but there is one finding we have not written about before—an observation that is especially relevant to our discussion of extremitization. From the outset of the study, one particular prisoner had very clear ambitions to be a future guard. He saw himself as capable of uniting the guards and getting them to work as a team (something with which they were having problems). Other prisoners teased him; they talked of mutiny, which he ignored. Then, during the promotion process, the guards overlooked this prisoner and promoted someone he viewed as weaker and less effective. His claim to guard identity had been publicly rebuffed in a humiliating way.
Almost immediately his demeanor and behavior changed. Previously he was a model inmate who shunned his fellow prisoners, but now he identified strongly with them. He had discouraged the prisoners from undermining the guards' authority, but now he joined in with great enthusiasm. And although he had supported the old order and helped maintain its existence, he began to emerge as a key instigator of a series of subversive acts that ultimately led to the overthrow and destruction of the guards' regime.
His dramatic conversion came after a series of psychological steps that are occurring regularly in our communities today: aspiration to belong, misrecognition, disengagement and disidentification. Outside of our prison experiment, the story goes something like this: Radical minority leaders use violence and hate to provoke majority authorities to institute a culture of surveillance against minority group members. This culture stokes misrecognition, which drives up disidentification and disengagement from the mainstream. And this distancing can make the arguments of the radicals harder to dismiss. Our point is that radical minority voices are not enough to radicalize someone, nor are the individual's own experiences. What is potent, though, is the mix of the two and their ability to reinforce and amplify each other.
The analysis of terrorism we present here is, of course, provisional as we continue to collect evidence. We do not deny that some individual terrorists may indeed have pathological personalities. But terrorism brings together many people who would not ordinarily be inclined to shoot a gun or plant a bomb. And so there can be no question that understanding it calls for a group-level examination—not just of radicals but of the intergroup dynamic that propels their behavior. This context is something we are all a part of, something that we all help to shape. Do we treat minority groups in our communities with suspicion? Do those who represent us question their claims to citizenship? Do we react to terror with calls for counterterror? The good news is that just as our analysis sees us as part of the problem, it also makes us part of the solution.
This article was originally published with the title "Fueling Extremes"
http://www.scientificamerican.com/article/fueling-terror-how-extremists-are-made/?utm_content=30234739&utm_medium=social&utm_source=facebook
Mar 21, 2016
What Motivates Extreme Self-Sacrifice?
Pacific Standard
March 21, 2016
Feb 28, 2016
MIND CONTROL: How people become trapped in Cults
How normal people like you and me can easily and gradually be deceived, over a period of time, by a destructive cult!
This video also reveals another frightening scenario: how a handful of corrupt people in positions of authority in the military could issue unjust unconstitutional orders to their subordinates to carry out acts of violence against their fellow countrymen, citizens who are guilty of nothing more than exercising their God-given Constitutional rights to keep and bear arms and protect their lands and their homes and the lives of themselves and their families!
https://youtu.be/8aw_5cmCwoc
Apr 10, 2013
The 21 Principles of Persuasion - Forbes
Jason Nazar
3/26/2013
How is it that certain people are so incredibly persuasive? Can we all harness those skills? After studying the most influential political, social, business and religious leaders, and trying countless techniques out myself, these are the 21 critical lessons I’ve identified to persuading people. This is an overview from a talk I’ve been giving to thousands of entrepreneurs for a few years now on “How to Persuade People.” More detailed examples are explained in the links below.
THE BASICS
1. Persuasion is not Manipulation - Manipulation is coercion through force to get someone to do something that is not in their own interest. Persuasion is the art of getting people to do things that are in their own best interest that also benefit you.
2. Persuade the Persuadable - Everyone can be persuaded, given the right timing and context, but not necessarily in the short term. Political campaigns focus their time and money on a small set of swing voters who decide elections. The first step of persuasion is always to identify those people that at a given time are persuadable to your point of view and focus your energy and attention on them.
3. Context and Timing - The basic building blocks of persuasion are context and timing. Context creates a relative standard of what’s acceptable. For example, the Stanford Prison Experiment proved that overachieving students could be molded into dictatorial prison guards. Timing dictates what we want from others and life. We choose to marry a different type of person than we date when we’re younger, because what we want changes.
4. You have to be Interested to be Persuaded - You can never persuade somebody who’s not interested in what you’re saying. We are all most interested in ourselves, and spend most of our time thinking about either money, love or health. The first art of persuasion is learning how to consistently talk to people about them; if you do that then you’ll always have their captive attention.
GENERAL RULES
5. Reciprocity Compels – When I do something for you, you feel compelled to do something for me. It is part of our evolutionary DNA to help each other out to survive as a species. More importantly, you can leverage reciprocity disproportionately in your favor. By providing small gestures of consideration to others, you can ask for more back in return which others will happily provide. (TIP: read “Influence” by Robert Cialdini)
6. Persistence Pays - The person who is willing to keep asking for what they want, and keeps demonstrating value, is ultimately the most persuasive. The way that so many historical figures have ultimately persuaded masses of people is by staying persistent in their endeavors and message. Consider Abraham Lincoln, who lost his mother, three sons, a sister, his girlfriend, failed in business and lost eight separate elections before he was elected president of the United States.
7. Compliment Sincerely - We are all so positively affected by compliments, and we’re more apt to trust people for whom we have good feelings. Try complimenting people sincerely and often for things they aren’t typically complimented for; it’s the easiest thing you can do to persuade others that doesn’t cost anything but a moment of thought.
8. Set Expectations - Much of persuasion is managing others’ expectations so they trust in your judgment. The CEO who promises a 20% increase in sales and delivers a 30% increase is rewarded, while the same CEO who promises a 40% increase and delivers 35% is punished. Persuasion is simply about understanding and over-delivering on others’ expectations.
9. Don’t Assume - Don’t ever assume what someone needs, always offer your value. In sales we’ll often hold back from offering our products/services because we assume others don’t have the money or interest. Don’t assume what others might want or not want, offer what you can provide and leave the choice to them.
10. Create Scarcity – Besides the necessities to survive, almost everything has value on a relative scale. We want things because other people want these things. If you want somebody to want what you have, you have to make that object scarce, even if that object is yourself.
11. Create Urgency – You have to be able to instill a sense of urgency in people to want to act right away. If we’re not motivated enough to want something right now, it’s unlikely we’ll find that motivation in the future. We have to persuade people in the present, and urgency is our most valuable card to play.
12. Images Matter – What we see is more potent than what we hear. It may be why pharma companies are now so forthcoming with the potentially horrible side effects of their drugs, when set to a background of folks enjoying a sunset in Hawaii. Perfect your first impressions. And master the ability to paint an image for others, in their mind’s eye, of a future experience you can provide for them.
13. Truth-Tell – Sometimes the most effective way to persuade somebody is by telling them the things about themselves that nobody else is willing to say. Facing hard truths is among the most piercing, meaningful events that happen in our lives. Truth-tell without judgement or agenda, and you’ll often find others’ responses quite surprising.
14. Build Rapport - We like people who are like us. This extends beyond our conscious decisions to our unconscious behaviors. By Mirroring and Matching others’ habitual behaviors (body language, cadence, language patterns, etc.) you can build a sense of rapport where people feel more comfortable with you and become more open to your suggestions.
PERSONAL SKILLS
15. Behavioral Flexibility - It’s the person with the most flexibility, not necessarily the most power, who’s in control. Children are often so persuasive because they’re willing to go through a litany of behaviors to get what they want (pouting, crying, bargaining, pleading, charming), while parents are stuck with the single response of “No.” The larger your repertoire of behaviors, the more persuasive you’ll be.
16. Learn to Transfer Energy - Some people drain us of our energy, while others infuse us with it. The most persuasive people know how to transfer their energy to others, to motivate and invigorate them. Sometimes it’s as straightforward as eye contact, physical touch, laughter, excitement in verbal responses, or even just active listening.
17. Communicating Clearly is Key - If you can’t explain your concept or point of view to an 8th grader, such that they could explain it with sufficient clarity to another adult, it’s too complicated. The art of persuasion lies in simplifying something down to its core, and communicating to others what they really care about.
18. Being Prepared Gives you the Advantage - Your starting point should always be to know more about the people and situations around you. Meticulous preparation allows for effective persuasion. For example, you dramatically improve your odds in a job interview by being completely versed in the company’s products, services, and background.
19. Detach and Stay Calm in Conflict - Nobody is more effective when they are “On Tilt.” In situations of heightened emotion, you’ll always have the most leverage by staying calm, detached and unemotional. In conflict, people turn to those in control of their emotions, and trust them in those moments to lead them.
20. Use Anger Purposefully - Most people are uncomfortable with conflict. If you’re willing to escalate a situation to a heightened level of tension and conflict, in many cases others will back down. Use this sparingly, and don’t do it from an emotional place or due to a loss of self-control. But do remember, you can use anger purposefully for your advantage.
21. Confidence and Certainty - There is no quality as compelling, intoxicating and attractive as certainty. It is the person who has an unbridled sense of certainty that will always be able to persuade others. If you really believe in what you do, you will always be able to persuade others to do what’s right for them, while getting what you want in return.
This article is available online at:
http://www.forbes.com/sites/jasonnazar/2013/03/26/the-21-principles-of-persuasion/
Mar 26, 2013
Robert Cialdini's YouTube animation of his criteria for Influence: the Science of Persuasion.
About Robert Cialdini:
Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing at Arizona State University, has spent his entire career researching the science of influence, earning him a worldwide reputation as an expert in the fields of persuasion, compliance, and negotiation.
Dr. Cialdini’s books, including Influence: Science & Practice and Influence: The Psychology of Persuasion, are the result of decades of peer-reviewed published research on why people comply with requests. Influence has sold over 3 million copies, is a New York Times Bestseller and has been published in 30 languages.
Because of the world-wide recognition of Dr. Cialdini’s cutting edge scientific research and his ethical business and policy applications, he is frequently regarded as the “Godfather of influence.”