Showing posts with label influence. Show all posts

Jul 17, 2023

Social Influence

Social influence is the process by which an individual's thoughts, feelings, and behaviors are affected by other people. It can take many forms, including conformity, socialization, peer pressure, obedience, leadership, persuasion, sales, and marketing.

Dec 15, 2021

CultNEWS101 Articles: 12/15/2021 (Conspiracy Theories, MDMA, Rajneesh, Influence)

Conspiracy Theories, MDMA, Rajneesh, Influence

The Conversation: How conspiracy theories in the US became more personal, more cruel and more mainstream after the Sandy Hook shootings
"Social media's role in spreading misinformation has been well documented in recent years. The year of the Sandy Hook shooting, 2012, marked the first year that more than half of all American adults used social media.

It also marked a modern low in public trust of the media. Gallup's annual survey has since shown even lower levels of trust in the media in 2016 and 2021.

These two coinciding trends – which continue to drive misinformation – pushed fringe doubts about Sandy Hook quickly into the U.S. mainstream. Speculation that the shooting was a false flag – an attack made to look as if it were committed by someone else – began to circulate on Twitter and other social media sites almost immediately. Far-right commentator and conspiracy theorist Alex Jones and other fringe voices amplified these false claims.

Jones was recently found liable by default in defamation cases filed by Sandy Hook families.

Mistakes in breaking news reports about the shooting, such as conflicting information on the gun used and the identity of the shooter, were spliced together in YouTube videos and compiled on blogs as proof of a conspiracy, as my research shows. Amateur sleuths collaborated in Facebook groups that promoted the shooting as a hoax and lured new users down the rabbit hole.

Soon, a variety of establishment figures, including the 2010 Republican nominee for Connecticut attorney general, Martha Dean, gave credence to doubts about the tragedy.

Six months later, as gun control legislation stalled in Congress, a university poll found 1 in 4 people thought the truth about Sandy Hook was being hidden to advance a political agenda. Many others said they weren't sure. The results were so unbelievable that some media outlets questioned the poll's accuracy.

Today, other conspiracy theories have followed a similar trajectory on social media. The media is awash with stories about the popularity of the bizarre QAnon conspiracy movement, which falsely claims top Democrats are part of a Satan-worshiping pedophile ring. A member of Congress, U.S. Rep. Marjorie Taylor Greene, has also publicly denied Sandy Hook and other mass shootings."

But back in 2012, the spread of outlandish conspiracy theories from social media into the mainstream was a relatively new phenomenon, and an indication of what was to come.  

Vice: Did the Cult From 'Wild Wild Country' Introduce MDMA to Ibiza?
"The free-loving sannyasins from the Bhagwan movement were a 'crucial bridge between Ibiza's 60s counterculture and the 90s electronic dance'.

"'I would only believe in a God who knew how to dance.'

So opined the famously deity-suspicious philosopher Friedrich Nietzsche in Thus Spoke Zarathustra. This remained an oft-used saying of Bhagwan Shree Rajneesh – the pseudonymous leader of the free-loving Bhagwan movement, the subject of Netflix's 2018 docuseries Wild Wild Country.

Nietzsche's quote twins nicely with a longstanding rumor that floats around the edges of drug culture: that Bhagwan's disciples (also called Rajneeshees, sannyasins or simply Bhagwans) actually introduced MDMA to Ibiza in the mid-80s. From here, the drug supposedly coalesced with the island's new Balearic sounds, played most famously by DJ Alfredo at the nightclub Amnesia, to sow seeds of many contemporary cultures – from electronic music and festivals to "the sesh" itself. But does the story stand up to scrutiny and did the Bhagwans help the world to dance and get high? I wanted to find out.

But before that, some history: In the 30s and 40s, Ibiza became a nexus of artists, musicians and beatniks escaping the vagaries of European fascism. Californian Vietnam War draft-dodging hippies were added to the melting pot and the island became a common stop on the hippie trail. From the mid-70s and into the 80s, the island's horny freaks and trust fund babies nurtured an embryonic club scene, with legendary venues like Amnesia, Pacha and KU serving a pleasure-seeking crowd.

MDMA, meanwhile, had evolved from the still-legal preserve of progressive 70s Californian psychotherapists to the gay nightlife scenes of New York, Chicago, and Dallas – in the latter, it was sold over the counter at the Starck nightclub. It was finally banned by the DEA in 1985, but not before the preeminent producers of ecstasy in America, known as the Texas Group, had reportedly churned out two million tablets in the weeks preceding the shutdown."

Inc: Want to Be More Influential, Persuasive, and Charismatic? Science Says First Take a Look at the Clock
New research shows how to leverage your circadian rhythm to increase your charisma and be more inspiring.

" ... Some people are extremely persuasive. They influence (in a good way) the people around them. They make people feel a part of something bigger than themselves.

In fact, every successful person I know is at least somewhat charismatic. And tends to be really good at persuading other people; not by manipulating or pressuring, but by describing the logic and benefits of an idea to gain agreement."


News, Education, Intervention, Recovery


CultEducationEvents.com

CultMediation.com   

Intervention101.com to help families and friends understand and effectively respond to the complexity of a loved one's cult involvement.

CultRecovery101.com assists group members and their families in making the sometimes difficult transition from coercion to renewed individual choice.

CultNEWS101.com news, links, resources.

Facebook

Flipboard

Twitter

Instagram

Cults101.org resources about cults, cultic groups, abusive relationships, movements, religions, political organizations and related topics.


Selection of articles for CultNEWS101 does not mean that Patrick Ryan or Joseph Kelly agree with the content. We provide information from many points of view in order to promote dialogue.


Please forward articles that you think we should add to cultintervention@gmail.com.


Oct 1, 2019

Certainty


Openingminds
October 1, 2019

Jon talks about certainty, and questions why we're certain about anything, bringing up a few historical examples to illustrate his point.

Oct 6, 2018

Cult Mediation: Influence

Cult Mediation: Influence

"6 fundamental social and psychological principles underlying the individual tactics that successful persuaders or compliance practitioners use every day to get us to say yes."

http://cultmediation.com/articles/influence/

Oct 1, 2017

Want to Make a Lie Seem True? Say It Again, and Again, and Again

Want to Make a Lie Seem True? Say It Again, and Again, and Again
EMILY DREYFUSS
Wired
February 11, 2017

You only use 10 percent of your brain. Eating carrots improves your eyesight. Vitamin C cures the common cold. Crime in the United States is at an all-time high.

None of those things are true.

But the facts don't actually matter: People repeat them so often that you believe them. Welcome to the “illusory truth effect,” a glitch in the human psyche that equates repetition with truth. Marketers and politicians are masters of manipulating this particular cognitive bias—which perhaps you have become more familiar with lately.

President Trump is a "great businessman," he says over and over again. Some evidence suggests that might not be true. Or look at just this week, when the president signed three executive orders designed to stop what he describes—over and over again—as high levels of violence against law enforcement in America. Sounds important, right? But such crimes are at their lowest rates in decades, as are most violent crimes in the US. Not exactly, as the president would have it, "American carnage."



"President Trump intends to build task forces to investigate and stop national trends that don’t exist," says Jeffery Robinson, deputy legal director of the American Civil Liberties Union. He's right that the trends aren't real, of course. But some number of people still believe it. Every time the president tweets or says something untrue, fact-checkers race to point out the falsehood—to little effect. A Pew Research poll last fall found 57 percent of presidential election voters believed crime across the US had gotten worse since 2008, despite FBI data showing it had fallen by about 20 percent.

So what's going on here? "Repetition makes things seem more plausible," says Lynn Hasher, a psychologist at the University of Toronto whose research team first noticed the effect in the 1970s. "And the effect is likely more powerful when people are tired or distracted by other information." So ... 2017, basically.

Brain Feels


Remember those "Head On! Apply Directly to the Forehead!" commercials? That's the illusory truth effect in action. The ads repeated the phrase so much that people found themselves at the drugstore staring at a glue-stick-like contraption thinking, "Apply directly to MY forehead!" The question of whether it actually alleviates pain gets smothered by a combination of tagline bludgeoning and tension headache.

Repetition is what makes fake news work, too, as researchers at Central Washington University pointed out in a study way back in 2012 before the term was everywhere. It's also a staple of political propaganda. It's why flacks feed politicians and CEOs sound bites that they can say over and over again. Not to go all Godwin's Law on you, but even Adolf Hitler knew about the technique. "Slogans should be persistently repeated until the very last individual has come to grasp the idea," he wrote in Mein Kampf.

The effect works because when people attempt to assess truth they rely on two things: whether the information jibes with their understanding, and whether it feels familiar. The first condition is logical: People compare new information with what they already know to be true and consider the credibility of both sources. But researchers have found that familiarity can trump rationality—so much so that hearing over and over again that a certain fact is wrong can have a paradoxical effect. It's so familiar that it starts to feel right.

"When you see the fact for the second time it's much easier to process—you read it more quickly, you understand it more fluently," says Vanderbilt University psychologist Lisa Fazio. "Our brain interprets that fluency as a signal for something being true"—whether it's true or not. In other words, rationality can be hard. It takes work. Your busy brain is often more comfortable running on feeling.

You are busy, too, so let me get back to Trump's latest executive orders, which are mostly symbolic. They certify that the government will do what it can to keep law enforcement officers safe. They contain vague language that civil rights advocates worry could lead to the criminalization of protest. But while perhaps unnecessary, the orders are hardly pointless—they reinforce the idea that America is unsafe, that law enforcement officers are at risk, that the country needs a strong "law and order" president. Data be damned.

As with any cognitive bias, the best way not to fall prey to it is to know it exists. If you read something that just feels right, but you don't know why, take notice. Look into it. Check the data. If that sounds like too much work, well, facts are fun.


Mar 14, 2017

Nine in 10 people would electrocute others if ordered, rerun of infamous Milgram Experiment shows 

The Milgram Experiment being conducted in the 1960s
Sarah Knapton, science editor 
Telegraph Science
March 14, 2017


A notorious experiment in the 1960s to find out if ordinary people were prepared to inflict pain if ordered to do so by an authority figure has reached an even more sinister conclusion.

Despite the lessons of history, nine in 10 would electrocute their peers even if they were screaming in agony, simply because they were told to do so.

When the original study was conducted by American psychologist Stanley Milgram, from Yale University, only two thirds of people continued all the way up to the maximum 450-volt level.

The experiments were devised to investigate the insistence by the German Nazi Adolf Eichmann, during his war crimes trial, that he and his accomplices in the Holocaust were "just following orders".

Fifty years later, the new version of the experiment conducted in Poland has shown that human nature, if anything, has got worse.

Most people say they would not inflict pain on others but are happy to do so if ordered to by an authority figure

This time, 80 participants were recruited, including women as well as men, and 90 per cent were willing to inflict the highest shock level of 450 volts to a complicit "learner" screaming in agony.

Social psychologist Dr Tomasz Grzyb, from the SWPS University of Social Sciences and Humanities in Poland, said: "Upon learning about Milgram's experiments, a vast majority of people claim that 'I would never behave in such a manner'.

"Our study has, yet again, illustrated the tremendous power of the situation the subjects are confronted with and how easily they can agree to things which they find unpleasant."

The participants, aged 18 to 69, were shown an electric generator which was demonstrated by administering a mild shock of 45 volts.

Volunteers were given a series of 10 levers to press, each appearing to send a successively higher shock to the learner - out of sight in a neighbouring room - via electrodes attached to the wrist.

In reality, no electric shocks were delivered, and, as in the original experiment, the learner was playing a role.

After pressing lever number two, "successive impulses of electricity resulted in screams of increasing pain from the learner," the scientists wrote in the journal Social Psychological and Personality Science.

"These screams were recorded and played back at appropriate moments."

The "teachers" were told they were taking part in research on memory and learning.

Just as in Milgram's experiment, they were spurred on by prompts from the supervising scientist such as "the experiment requires that you continue", "it is absolutely essential that you continue", and "you have no other choice, you must go on".

Mercy was more apparent when the learner was a woman. In this case, the number of participants refusing to carry out the orders of the experimenter was three times higher than when the person receiving the "shocks" was a man.

Dr Grzyb concluded: "Half a century after Milgram's original research into obedience to authority, a striking majority of subjects are still willing to electrocute a helpless individual."

A recent study by St Andrews University suggested that people were happy to inflict pain on others if they believed it was for the greater good. The researchers looked back through records of the original experiment and found that those who took part were not unhappy with their choice.

http://www.telegraph.co.uk/science/2017/03/14/nine-10-people-would-electrocute-others-ordered-re-run-milgram/

Mar 26, 2016

Fueling Terror: How Extremists Are Made

The psychology of group dynamics goes a long way toward explaining what drives ordinary people toward radicalism

S. Alexander Haslam and Stephen D. Reicher
Scientific American
March 25, 2016

Understanding Co-radicalization

Although we may think of terrorists as sadists and psychopaths, social psychology suggests they are mostly ordinary people, driven by group dynamics to do harm for a cause they believe to be noble and just. Terrorism reconfigures these group dynamics so that extreme leadership seems more appealing to everyone. Just as ISIS feeds off immoderate politicians in the West, for example, so do those immoderate politicians feed off ISIS to draw support for themselves. Having others misperceive or deny a valued identity—an experience we describe as misrecognition—systematically provokes anger and cynicism toward authorities.

The steep and virulent rise of terrorism ranks among the more disturbing trends in the world today. According to the 2015 Global Terrorism Index, terror-related deaths have increased nearly 10-fold since the start of the 21st century, surging from 3,329 in 2000 to 32,685 in 2014. Between 2013 and 2014 alone, they shot up 80 percent. For social psychologists, this escalation prompts a series of urgent questions, just as it does for society as a whole: How can extremist groups treat fellow human beings with such cruelty? Why do their barbaric brands of violence appeal to young people around the globe? Who are their recruits, and what are they thinking when they target innocent lives?

Many people jump to the conclusion that only psychopaths or sadists—individuals entirely different from us—could ever strap on a suicide vest or wield an executioner's sword. But sadly that assumption is flawed. Thanks to classic studies from the 1960s and 1970s, we know that even stable, well-adjusted individuals are capable of inflicting serious harm on human beings with whom they have no grievance whatsoever. Stanley Milgram's oft-cited “obedience to authority” research showed that study volunteers were willing to administer what they believed to be lethal electric shocks to others when asked to do so by a researcher in a lab coat. Fellow psychologist Philip Zimbardo's (in)famous Stanford Prison Experiment revealed that college students assigned to play the part of prison guards would humiliate and abuse other students who were prisoners.

These studies proved that virtually anyone, under the right—or rather the wrong—circumstances, could be led to perpetrate acts of extreme violence. And so it is for terrorists. From a psychological perspective, the majority of adherents to radical groups are not monsters—much as we would like to believe that—no more so than were the everyday Americans participating in Milgram's and Zimbardo's investigations. As anthropologist Scott Atran notes, drawing on his long experience of studying these killers, most are ordinary people. What turns someone into a fanatic, Atran explained in his 2010 book Talking to the Enemy, “is not some inherent personality defect but the person-changing dynamic of the group” to which he or she belongs.

For Milgram and Zimbardo, these group dynamics had to do with conformity—obeying a leader or subscribing to the majority view. During the past half a century, though, our understanding of how people behave both within and among groups has advanced. Recent findings challenge the notion that individuals become zombies in groups or that they can be easily brainwashed by charismatic zealots. These new insights are offering a fresh take on the psychology of would-be terrorists and the experiences that can prime them toward radicalization.

In particular, we are learning that radicalization does not happen in a vacuum but is driven in part by rifts among groups that extremists seek to create, exploit and exacerbate. If you can provoke enough non-Muslims to treat all Muslims with fear and hostility, then those Muslims who previously shunned conflict may begin to feel marginalized and heed the call of the more radical voices among them. Likewise, if you can provoke enough Muslims to treat all Westerners with hostility, then the majority in the West might also start to endorse more confrontational leadership. Although we often think of Islamic extremists and Islamophobes as being diametrically opposed, the two are inextricably intertwined. And this realization means that solutions to the scourge of terror will lie as much with “us” as with “them.”

FOLLOWING THE LEADER

Milgram's and Zimbardo's findings showed that almost anyone could become abusive. If you look closely at their results, though, most participants did not. So what distinguished those who did? The pioneering work of social psychologists Henri Tajfel and John Turner in the 1980s, though unrelated, suggested part of the answer. They argued that a group's behavior and the ultimate influence of its leaders depended critically on two interrelated factors: identification and disidentification. Specifically, for someone to follow a group—possibly to the point of violence—he or she must identify with its members and, at the same time, detach from people outside the group, ceasing to see them as his or her concern.

We confirmed these dynamics in our own work that has revisited Zimbardo's and Milgram's paradigms. Across a number of different studies, we have found consistently that, just as Tajfel and Turner proposed, participants are willing to act in oppressive ways only to the extent that they come to identify with the cause they are being asked to advance—and to disidentify with those they are harming. The more worthwhile they believe the cause to be, the more they justify their acts as regrettable but necessary.

This understanding—that social identity and not pressure to conform governs how far someone will go—resonates with findings about what actually motivates terrorists. In his 2004 book Understanding Terror Networks, forensic psychiatrist Marc Sageman, a former CIA case officer, emphasized that terrorists are generally true believers who know exactly what they are doing. “The mujahedin were enthusiastic killers,” he noted, “not robots simply responding to social pressures or group dynamics.” Sageman did not dismiss the importance of compelling leaders—such as Osama bin Laden and ISIS's Abu Bakr al-Baghdadi—but he suggested that they serve more to provide inspiration than to direct operations, issue commands or pull strings.

Indeed, there is little evidence that masterminds orchestrate acts of terror, notwithstanding the language the media often use when reporting these events. Which brings us to a second recent shift in our thinking about group dynamics: we have observed that when people do come under the influence of authorities, malevolent or otherwise, they do not usually display slavish obedience but instead find unique, individual ways to further the group's agenda. After the Stanford Prison Experiment had concluded, for example, one of the most zealous guards asked one of the prisoners whom he had abused what he would have done in his position. The prisoner replied: “I don't believe I would have been as inventive as you. I don't believe I would have applied as much imagination to what I was doing.... I don't think it would have been such a masterpiece.” Individual terrorists, too, tend to be both autonomous and creative, and the lack of a hierarchical command structure is part of what makes terrorism so hard to counter.

How do terror leaders attract such engaged, innovative followers if they are not giving direct orders? Other discoveries from the past few decades (summarized in our 2011 book, co-authored with Michael J. Platow, The New Psychology of Leadership) highlight the role leaders play in building a sense of shared identity and purpose for a group, helping members to frame their experiences. They empower their followers by establishing a common cause and empower themselves by shaping it. Indeed, Milgram's and Zimbardo's experiments are object lessons in how to create a shared identity and then use it to mobilize people toward destructive ends. Just as they convinced the participants in their studies to inflict harm in the name of scientific progress, so successful leaders need to sell the enterprise they envision for their group as honorable and noble.

Both al Qaeda and ISIS deploy this strategy. A large part of their appeal to sympathizers is that they promote terror for the sake of a better society—one that harks back to the peaceful community that surrounded the prophet Mohammed. Last year University of Arizona journalism professor Shahira Fahmy carried out a systematic analysis of ISIS's propaganda and found that only about 5 percent depicts the kind of brutal violence typically seen on Western screens. The great majority features visions of an “idealistic caliphate,” which would unify all Muslims harmoniously. Moreover, a significant element of ISIS's success—one that makes it more threatening than al Qaeda—lies in the very fact that its leaders lay claim to statehood. In the minds of its acolytes at least, it has the means to try to make this utopian caliphate a reality.

Crucially, however, the credibility and influence of leaders—especially those who promote conflict and violence—depend not only on what they say and do but also on their opponents' behavior. Evidence for this fact emerged after a series of experiments by one of us (Haslam) and Ilka Gleibs of the London School of Economics that looked at how people choose leaders. One of the core findings was that people are more likely to support a bellicose leader if their group faces competition with another group that is behaving belligerently. Republican candidate Donald Trump might have been wise to ponder this before he suggested that all Muslim immigrants are potential enemies who should be barred from entering the U.S. Far from weakening the radicals, such statements provide the grit that gives their cause greater traction. Indeed, after Trump made his declaration, an al Qaeda affiliate reaired it as part of its propaganda offensive.

THE GRAY ZONE

Just as ISIS feeds off immoderate politicians in the West, so those immoderate politicians feed off ISIS to draw support for themselves. This exchange is part of what religion scholar Douglas Pratt of the University of Waikato in New Zealand refers to as co-radicalization. And here lies the real power in terrorism: it can be used to provoke other groups to treat one's own group as dangerous—which helps to consolidate followers around those very leaders who preach greater enmity. Terrorism is not so much about spreading fear as it is about seeding retaliation and further conflict. Senior research fellow Shiraz Maher of the International Center for the Study of Radicalization and Political Violence at King's College London has pointed out how ISIS actively seeks to incite Western countries to react in ways that make it harder for Muslims to feel that they belong in those communities.

In February 2015 the ISIS-run magazine Dabiq carried an editorial entitled “The Extinction of the Grayzone.” Its writers bemoaned the fact that many Muslims did not see the West as their enemy and that many refugees fleeing Syria and Afghanistan actually viewed Western countries as lands of opportunity. They called for an end of the “gray zone” of constructive coexistence and the creation of a world starkly divided between Muslim and non-Muslim, in which everyone either stands with ISIS or with the kuffar (nonbelievers). It also explained the attacks on the headquarters of the French magazine Charlie Hebdo in exactly these terms: “The time had come for another event—magnified by the presence of the Caliphate on the global stage—to further bring division to the world.”

In short, terrorism is all about polarization. It is about reconfiguring intergroup relationships so that extreme leadership appears to offer the most sensible way of engaging with an extreme world. From this vantage, terrorism is the very opposite of mindless destruction. It is a conscious—and effective—strategy for drawing followers into the ambit of confrontational leaders. Thus, when it comes to understanding why radical leaders continue to sponsor terrorism, we need to scrutinize both their actions and our reactions. As editor David Rothkopf wrote in Foreign Policy after the Paris massacres last November, “overreaction is precisely the wrong response to terrorism. And it's exactly what terrorists want.... It does the work of the terrorists for the terrorists.”

Currently counterterrorism efforts in many countries give little consideration to how our responses may be upping the ante. These initiatives focus only on individuals and presume that radicalization starts when something happens to undermine someone's sense of self and purpose: discrimination, the loss of a parent, bullying, moving, or anything that leaves the person confused, uncertain or alone. Psychologist Erik Erikson noted that youths—still in the process of forming a secure identity—are particularly vulnerable to this kind of derailment [see “Escaping Radicalism,” by Dounia Bouzar, on page 40]. In this state, they become easy prey for radical groups, who claim to offer a supportive community in pursuit of a noble goal.

We have no doubt that this is an important part of the process by which people are drawn into terrorist groups. Plenty of evidence points to the importance of small group ties, and, according to Atran and Sageman, Muslim terrorists are characteristically centered on clusters of close friends and kin. But these loyalties alone cannot adequately address what Sageman himself refers to as “the problem of specificity.” Many groups provide the bonds of fellowship around a shared cause: sporting groups, cultural groups, environmental groups. Even among religious factions—including Muslim groups—the great majority provide community and meaning without promoting violence. So why, specifically, are some people drawn to the few Muslim groups that do preach violent confrontation?

We argue that these groups are offering much more than consolation and support. They also supply narratives that resonate with their recruits and help them make sense of their experiences. And in that case, we need to seriously examine the ideas militant Muslim groups propagate—including the notion that the West is a long-standing enemy that hates all Muslims. Do our “majority” group reactions somehow lend credence to radicalizing voices in the minority Muslim community? Do police, teachers and other prominent figures make young Muslims in the West feel excluded and rejected—such that they come to see the state less as their protector and more as their adversary? If so, how does this change their behavior?

To begin to find out, one of us (Reicher), working with psychologists Leda Blackwood, now at the University of Bath in England, and Nicholas Hopkins of the University of Dundee in Scotland, conducted a series of individual and group interviews at Scottish airports in 2013. As national borders, airports send out clear signals about belonging and identity. We found that most Scots—Muslim and non-Muslim alike—had a clear sense of “coming home” after their travels abroad. Yet many Muslim Scots had the experience of being treated with suspicion at airport security. Why was I pulled aside? Why was I asked all those questions? Why was my bag searched? In the words of one 28-year-old youth worker: “For me to be singled out felt [like], ‘Where am I now?’ I consider Scotland my home. Why am I being stopped in my own house? Why am I being made to feel as the other in my own house?”

We gave the term “misrecognition” to this experience of having others misperceive or deny a valued identity. It systematically provoked anger and cynicism toward authorities. It led these individuals to distance themselves from outwardly British-looking people. After such an experience, one Muslim Scot said he felt that he would look ridiculous if he then continued to advocate trust in the agencies that had humiliated him. In other words, misrecognition can silence those who, having previously felt aligned with the West, might have been best placed to prevent further polarization. To be clear, misrecognition did not instantly turn otherwise moderate people into terrorists or even extremists. Nevertheless, it began to shift the balance of power away from leaders who say, “Work with the authorities; they are your friends,” toward those who might insist, “The authorities are your enemy.”

A CAUTIONARY TALE

We can take this analysis of misrecognition and its consequences a step further. When we adapted Zimbardo's prison study in our own research, we wanted to reexamine what happens when you mix two groups with unequal power. For one thing, we wanted to test some of the more recent theories about how social identity affects group dynamics. For instance, we reasoned that prisoners would identify with their group only if they had no prospect of leaving it. So we first told the volunteers assigned to be prisoners that they might be promoted to be guards if they showed the right qualities. Then, after a single round of promotions, we told them that there would be no more changes. They were stuck where they were.

We have discussed the effects of these manipulations in many publications, but there is one finding we have not written about before—an observation that is especially relevant to our discussion of extremitization. From the outset of the study, one particular prisoner had very clear ambitions to be a future guard. He saw himself as capable of uniting the guards and getting them to work as a team (something with which they were having problems). Other prisoners teased him; they talked of mutiny, which he ignored. Then, during the promotion process, the guards overlooked this prisoner and promoted someone he viewed as weaker and less effective. His claim to guard identity had been publicly rebuffed in a humiliating way.

Almost immediately his demeanor and behavior changed. Previously he was a model inmate who shunned his fellow prisoners, but now he identified strongly with them. He had discouraged the prisoners from undermining the guards' authority, but now he joined in with great enthusiasm. And although he had supported the old order and helped maintain its existence, he began to emerge as a key instigator of a series of subversive acts that ultimately led to the overthrow and destruction of the guards' regime.

His dramatic conversion came after a series of psychological steps that are occurring regularly in our communities today: aspiration to belong, misrecognition, disengagement and disidentification. Outside of our prison experiment, the story goes something like this: Radical minority leaders use violence and hate to provoke majority authorities to institute a culture of surveillance against minority group members. This culture stokes misrecognition, which drives up disidentification and disengagement from the mainstream. And this distancing can make the arguments of the radicals harder to dismiss. Our point is that radical minority voices are not enough to radicalize someone, nor are the individual's own experiences. What is potent, though, is the mix of the two and their ability to reinforce and amplify each other.

The analysis of terrorism we present here is, of course, provisional as we continue to collect evidence. We do not deny that some individual terrorists may indeed have pathological personalities. But terrorism brings together many people who would not ordinarily be inclined to shoot a gun or plant a bomb. And so there can be no question that understanding it calls for a group-level examination—not just of radicals but of the intergroup dynamic that propels their behavior. This context is something we are all a part of, something that we all help to shape. Do we treat minority groups in our communities with suspicion? Do those who represent us question their claims to citizenship? Do we react to terror with calls for counterterror? The good news is that just as our analysis sees us as part of the problem, it also makes us part of the solution.

This article was originally published with the title "Fueling Extremes"

http://www.scientificamerican.com/article/fueling-terror-how-extremists-are-made/?utm_content=30234739&utm_medium=social&utm_source=facebook

Mar 21, 2016

What Motivates Extreme Self-Sacrifice?

New work in the field of anthropology says violent extremism isn't really motivated by religion—but by fusion with the group.
HARVEY WHITEHOUSE
Pacific Standard
March 21, 2016
Misrata, Libya, 2011. I am ushered into the boardroom of what was once an oil investment corporation. I am surrounded by youths with Kalashnikovs. On the other side of the table are several of Libya's most respected rebel leaders, foremost among them Salim Jawha, a former colonel in Muammar Gaddafi's army who defected on the first day of the revolution.
In the preceding months, over 1,000 rebels have been killed and many thousands more horrifically injured. Stories of heroism are commonplace. For example, on March 6, Gaddafi's forces—supported by seven tanks and some 25 vehicles with mounted machine guns—attempted to re-take the city but were ambushed and overcome by rebels. Despite the imbalance of military hardware and heavy loss of life, the rebels prevailed through astonishing courage and determination.
I'm here because I want to know what motivated thousands of civilians, most of whom had never even held a gun before, to take up arms as part of a popular uprising in which death was far likelier than victory. A more general version of this question has been guiding my research for some years, in my work with a wide variety of military groups ranging from tribal warriors in the rainforests of Papua New Guinea to highly trained soldiers in the British special forces and Royal Marines. One of the themes that continually surfaces in these conversations is that fighters don't put their lives on the line for abstract values like "king and country" or "God, freedom, and democracy." They do it for each other.
At the University of Oxford, I lead an international network of researchers dedicated to understanding what makes bonds so strong that people will fight and die for the group when it is threatened. Our research suggests that one of the most powerful causes of extreme pro-group action is the sharing of self-defining experiences. If so, this has profound implications for the way we should approach conflict resolution and counter-terrorism. Public debate and policymaking has been dominated for years by the view that extreme beliefs are what motivate extreme behaviors. I disagree—but with such a tide of popular opinion against me, I need evidence not only from the laboratory or even from the assault course and training camp, but also from the frontlines. This has brought me to Libya.
Seated beside me is Brian McQuinn, my doctoral student, whose shared interest in the cohesion of armed groups led him to enter the country via Malta on a converted fishing trawler several months earlier. Through his earlier work for the United Nations and the Carter Center, McQuinn acquired formidable skills for establishing rapport with fighters and ex-combatants in troubled regions. I am nevertheless dazzled by his ability not only to spirit me into Libya at this difficult time but then also through the heavily armed fortifications of this rebel stronghold for an audience with Misrata's revolutionary leaders.
Jawha lights a cigarette, then exhales slowly: "When the revolution began, there was no compulsion to join. We just called our friends and asked them: Do you want to die or not? If you want to die, come with us. If not, go home and stay out of harm's way. This is not a time for reflection and discussion. He who wishes to die can accomplish anything. He who does not may go in peace."
It is said that any one of the civilians-turned-fighters in Misrata's ka'tib (battalions) was worth 10 times a professional soldier in any conventional army. Why? According to Jawha, what matters for a revolutionary is the goal, not the job. Goals, he continues, engage you personally, and if you care about them enough, you will do anything in your power to bring them to fruition. Jobs, by contrast, are just a way of paying the bills—you'll do the minimum necessary for a regular paycheck, and that's all.
I ask Jawha for an example of heroism. In a room packed with people who have lost scores of their closest friends in frontline combat, this is probably a tactless request. His steely blue eyes fix me through the smoke and he replies: "The tanks were driving into Tripoli Street and there was a flag on the back of this tank; a kid of maybe 13, he climbed onto this tank as it was moving. Why? Just to remove the green flag and put our flag. The revolutionary flag."
The boy no doubt expected to be killed. Everyone expected him to be killed. Amazingly, he survived. Thousands of other rebels didn't.
In an impromptu memorial to the martyrs of the revolution in Misrata, headshots of martyrs were placed on public display alongside booty seized from Gaddafi and his supporters.
People commonly associate extreme self-sacrifice with Islamist martyrs in the Middle East, Pakistan, and Southeast Asia, but the phenomenon is actually much more ancient and widespread than that. Willingness to walk into the jaws of death for the sake of a group has been far more commonplace in the human past than the current discourse acknowledges. The concept of a suicide attack, in the public imagination, has become so inextricably linked to jihadism that, for many of us who repudiate radical Islamist teachings, it has become practically impossible to imagine extreme self-sacrifice as anything other than an outgroup trait—the anthropological term for something that crazy people in other cultures do, but not something that we could ever engage in.
But is what motivates Islamist martyrdom really so different, psychologically, from what drove young men to sign up in droves 100 years ago to serve as cannon fodder in World War I? How many of us would lay down our lives for our closest family members if there were no other way to save them? I suspect rather a lot of us would do the latter almost as a natural, inescapable expression of the bonds of kinship.
Is it this sort of common psychology that drives self-sacrifice and suicide attacks, or is it—as many opinion leaders would have us believe—an especially dangerous and virulent form of religious dogmatism? The religion narrative may be the easiest to tell, but rigorous empirical study among very different types of groups across the globe suggests something more complicated: a story in which every human being contains the potential for violent self-sacrifice.
To many observers of suicide attacks, it seems obvious that religious dogmatism alone is to blame. The dogmas of Islam are often singled out as examples. On the one side there are public intellectuals, including Sam Harris and the late Christopher Hitchens, who have long argued that Islam is essentially a totalitarian religion that insists on punishing unauthorized deviation from its tenets, not only among followers but among non-Muslims as well. For these commentators, the acts of barbarism carried out by ISIS are as fundamentally rooted in the teachings of Islam as the officially sanctioned beheadings and extreme forms of corporal punishment in nation-states such as Saudi Arabia. In much the same vein, Harris, Hitchens, and others have repeatedly insisted that what motivates suicide attacks is likewise the savage dogmatism of a medieval religious cast of mind—and above all the belief that acts of martyrdom will be rewarded in the afterlife. The central argument among such thinkers is that this theocratic mind-set, a rare but dangerous throwback to the past, is what drives the various forms of terrorism plaguing the world today.
An alternative interpretation comes from the ranks of more moderate forms of Islam and its defenders among the liberal intelligentsia. Liberals often like to say that Islam itself does not promote violence; rather, certain authoritarian states and terrorist organizations manipulate the teachings of Islam to sanction their own brutality, but these are perversions of Muslim doctrine and unrepresentative of the moral majority. Note, however, that apologists seeking to explain extreme actions as the behavior of a putative moral minority nevertheless employ the same logic as Harris and Hitchens. Reza Aslan, for example, distinguishes what he calls the "cosmic war" of al-Qaeda from more prosaic skirmishes that we routinely see among rival groups everywhere. In other words, an extraordinary ideology (in this case, the "cosmic dualism" of radical Islam) is blamed for making people do barbaric things on a more barbaric scale than would be the case without that ideology.
Citizen Soldiers: A band of rebel forces work together as a unit to clear out the pro-Muammar Gaddafi Abu Salim district on the southern side of Tripoli, Libya, in late August of 2011. (Photo: Benjamin Low/Getty Images)
Ironically, both atheists and moderate Muslims cut their arguments from the same cloth: They argue that extreme beliefs are responsible for motivating extreme behavior. The only difference is that liberals and moderates attribute extreme beliefs to a radical minority in Islam, whereas the New Atheists argue that extremism is woven into the very fabric of religious thinking itself.
The idea that beliefs drive behavior is seductive. For Harris and many others it is a simple and inescapable fact. As Harris writes in The End of Faith: "As a man believes, so he will act. Believe that you are a member of a chosen people, awash in the salacious exports of an evil culture that is turning your children away from God, believe that you will be rewarded with an eternity of unimaginable delights by dealing death to these infidels—and flying a plane into a building is only a matter of being asked to do it."
But if Libya's revolution was motivated by belief—e.g., that Gaddafi was an evil dictator who should be overthrown—then why did it take people so long to act? And why did some people fight and die while others ran away? These questions, so apparently simple, are far from easy to answer.
A vast body of evidence from experimental psychology spanning many decades shows that what motivates behavior is often not available to conscious awareness and hardly containable within straightforward, verbally articulable beliefs and arguments. In the case of extreme behavior there is an alternative explanation as to why people will fight to the death or blow themselves up—namely, that they are doing it for each other, for the group. They are not really doing it because of doctrines or ideologies or even in the hope of personal rewards in the afterlife based on those belief systems. True, suicide bombers may have the impression (or even the delusional conviction) that their behavior is driven by religious teachings, but that doesn't make it so. Nevertheless, if religious dogma is not the real motivator, what is?
Several years ago, psychologists, led by William B. Swann at the University of Texas–Austin, discovered a remarkably powerful form of group alignment, one that was capable of motivating extraordinary levels of pro-social commitment, at least when presented with hypothetical scenarios in which self-sacrifice was the only way to save other members of your group. They referred to this extreme form of attachment to the group as identity fusion. Excited by this new research, I went to Texas to find out more. It turned out that, although they had learned quite a bit about the mechanisms underlying fusion, and the optimal way of measuring it, they hadn't yet devised a well-developed theory as to what caused fusion in the first place. This is where I felt I had something to contribute.
As an anthropologist in the field, I had long been studying fusion without a way of measuring it. Still, I recognized the syndrome and had some ideas about its underlying causes. In the rainforests of Papua New Guinea, where my field research began, groups of young men underwent painful rites of initiation into small warrior cults responsible for carrying out daring raids and protecting the community from its enemies. This got me investigating painful or frightening rituals among a wide range of groups. And it became increasingly clear that the most unpleasant rituals were often found in groups that depended most for their survival on their members sticking together in the face of strong temptations to defect. Warfare is the most common example: There's a very strong temptation to run away in the face of enemy attack, so you need strong inducements to stand firm. While you can enforce loyalty to some extent (e.g. by shooting deserters), the more effective fighting units seem to be ones that are motivated by fusion rather than by coercion.
For a number of years now, I have been working with Swann and his group to find out what it is about painful rituals that causes participants to fuse with one another. And we think we've found out. When people undergo painful or scary ordeals, those experiences stick with them through life—they change to such a degree that, without those experiences, they really wouldn't be the same people anymore. This is even truer in the case of painful rituals: To the extent that rituals provoke a search for symbolic meanings, they can be experienced as quite revelatory and life-changing. In basic terms, painful rituals make people reflect more deeply. Such reflections enrich the essential narrative self (the distinctive autobiographical history that makes me, me—as opposed to anyone else). Our surveys with United States military, for example, show that the more war veterans reflect on the horrors of frontline combat, the more fused they are with each other.
"When the revolution began, there was no compulsion to join. We just called our friends and asked them: 'Do you want to die or not?'"
But how do you get from self-defining individual experiences to the social experience of fusion? Our research suggests that, when you believe others have gone through the same self-defining experiences, it makes the boundary between you and others more porous. This is the essence of fusion with a group. Once fused, people start to treat the group as part of themselves—and to feel empowered by that sensation. When you attack the group, it feels, to the fused person, like a personal attack. The urge to defend the group becomes as primal as the urge to defend the self.
All of which leads us to a rather startling hypothesis: that the reason people willingly walk into the jaws of death (e.g. by carrying out suicide attacks) is because they think that in doing so they are defending themselves and their group—which are really the same thing—against an outgroup threat.
The vast majority of Libyans who took up arms in 2011 were ordinary civilians, untrained and unprepared. They knew their chances of survival were poor. Many thousands were killed or suffered devastating injuries. With the blessing of the revolutionary leadership (secured through McQuinn's contacts), we surveyed 179 surviving members of four battalions. The goal of this survey was to measure fusion with various groups: family, members of the battalion, all fighters in the revolution, and those who supported the revolution but didn't participate in it. Roughly half the sample comprised frontline fighters; the other half were providers of logistical support within the battalions (e.g. they drove or repaired ambulances).
The findings reinforced in startling ways what I'd seen in Papua New Guinea: The overwhelming majority of revolutionaries were fused with their families, with their battalions, and with the members of other battalions. In relation to all these groups, no less than 96 percent of all revolutionaries chose the highest possible levels on the fusion scale. Such high levels of fusion are a sure sign that these men were psychologically prepared to fight and die for each other—though of course we hardly needed evidence of that under the circumstances. But we also found that only a very small proportion (less than 1 percent) of all revolutionaries were fused with those supporters of the revolution who didn’t take up arms. In other words, simply being on the same side ideologically (sharing the same beliefs and goals) doesn’t predict fusion. What really seems to matter is having experienced, together, the intense fear and pain of warfare by virtue of being in a revolutionary battalion.
In the Wings: Extravagant art recovered from one of Muammar Gaddafi's houses graces the street of Misrata, Libya, during an uprising in 2011 that saw many thousands of civilians take up arms. (Photo: Harvey Whitehouse)
Now comes a twist. We asked all participants in the survey to say which group they would choose as their primary fusion target if they could choose only one of them. In other words, we used a forced-choice question: They had to determine which group they were the most fused with. And here we found a striking difference between our two samples. Nearly half of frontline fighters chose their battalions over their families as the primary fusion target. By contrast, only 28 percent of those who provided logistical support chose battalion over family. One plausible interpretation of this finding is that frontline fighters were more fused with each other because they had undergone more intense, self-shaping experiences together.
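As a rough sketch of the forced-choice comparison described above (using invented toy data, since the study's actual records are not reproduced here), the analysis amounts to computing, within each role, the share of respondents who named a given group as their primary fusion target:

```python
# Hypothetical, illustrative data only -- not the study's actual responses.
# Sketch of tabulating a forced-choice fusion measure by respondent role.

responses = [
    # (role, primary fusion target chosen)
    ("frontline", "battalion"), ("frontline", "family"),
    ("frontline", "battalion"), ("frontline", "family"),
    ("logistics", "family"), ("logistics", "family"),
    ("logistics", "battalion"), ("logistics", "family"),
]

def share_choosing(responses, role, target):
    """Proportion of respondents in `role` whose forced choice was `target`."""
    chosen = [t for r, t in responses if r == role]
    return chosen.count(target) / len(chosen)

print(share_choosing(responses, "frontline", "battalion"))  # 0.5
print(share_choosing(responses, "logistics", "battalion"))  # 0.25
```

With the real data, the same tabulation yields the reported contrast: nearly half of frontline fighters chose battalion over family, versus 28 percent of logistical-support members.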
In light of our research, three psychological explanations for extreme self-sacrifice stand out as especially promising. One is that people are willing to risk their lives to defend a group that they are fused with; they may also express ancillary commitment to the group's values or beliefs, but those ideological commitments exert little real power over their behavior. A second possibility is that fusion with a group is the main motivator of extreme self-sacrifice, but beliefs can also have an amplifying effect—basically increasing the power of fusion. A third possibility is that fusion motivates extreme self-sacrifice, but so, too, do beliefs, independently of fusion.
There are, of course, other possibilities, but these three are currently the focus of a major program of empirical research designed to tease apart the effects of fusion and belief. In a recent study, currently being written up, my associates Jonathan A. Lanman and Michael D. Buhrmester demonstrate that fusion beats fundamentalism hands-down in predicting endorsements of self-sacrifice. If that's true, maybe we should be looking for signs of fusion rather than signs of extreme belief in our efforts to overcome the threat of terrorism.
Does all this mean that we should stop blaming religion for suicide attacks? If by "religion" we mean beliefs in particular creeds, orthodoxies, doctrines, and stories, then the answer is yes. We should stop blaming those things until we have much better evidence that they really are the cause of extreme behavior.
Figuring out what makes individuals and groups act in ways conducive or obstructive to human thriving at any level (local, national, global, or whatever) is an urgent challenge for the policy world. And I mean urgent. We need to figure out the psychology of group alignment and its behavioral outcomes as a matter of the highest priority. The policy community may not appreciate how badly the relevant sciences have been lagging behind on this issue. Consider our study in Libya, for example, where we focused on the question of how fusion with comrades affected the behavior of revolutionaries. You'd be forgiven for thinking that there must have been loads of similar studies looking at the role of social cohesion in all kinds of military groups, from conventional forces to terrorist cells. But you'd be wrong. In fact, the only studies of cohesion in the military that we've been able to track down all concern factors influencing group performance—not willingness to sacrifice one's life for the group.
Unfortunately, "common sense" and received wisdom have been dominating the scientific research agenda when, in fact, we should be letting the science drive our ideas about the causes of extreme behavior. When we next hear politicians or public intellectuals declaring that ideologues, preachers of hate, or religious extremists are solely responsible for motivating suicide attacks and other acts of terrorism, let's not simply accept this as if it were an established fact.
Curtailing freedom of speech may do nothing to prevent or deter intractable conflict in the world. In fact, quite the opposite: A sense of oppression may constitute yet another threat, alongside bombs and bullets, to already embattled groups, fusing them ever more tightly together. If the theories we have been developing and testing are correct, a far more effective way to combat extreme behavior is to take seriously the transformative experiences of groups that feel oppressed or threatened.
It is possible, in principle, to modify the perceived authenticity and sharedness of such experiences through subtle interventions, at both an interpersonal and population level. Not only politicians, educators, and other public figures, but crucially also parents, religious groups, and their leaders could play a role in the process. But that will only be possible if we are all better informed about the real causes of violent intergroup conflict. In the end, science-driven approaches are likely to do far more than censorship and anti-Islamist rhetoric to stem the tide of radicalization, and to defuse those already committed to extreme pro-group action.
http://www.psmag.com/health-and-behavior/what-motivates-extreme-self-sacrifice

Feb 28, 2016

MIND CONTROL: How people become trapped in Cults

Jan 22, 2009


How normal people like you and me can, over a period of time, be gradually duped and deceived by a destructive cult!

This video also reveals another frightening scenario: how a handful of corrupt people in positions of authority in the military could issue unjust, unconstitutional orders to their subordinates to carry out acts of violence against their fellow countrymen—citizens guilty of nothing more than exercising their God-given Constitutional rights to keep and bear arms and to protect their lands, their homes, and the lives of their families!

https://youtu.be/8aw_5cmCwoc

Apr 10, 2013

The 21 Principles of Persuasion - Forbes


Jason Nazar
3/26/2013

How is it that certain people are so incredibly persuasive? Can we all harness those skills?  After studying the most influential political, social, business and religious leaders, and trying countless techniques out myself, these are the 21 critical lessons I've identified for persuading people.  This is an overview from a talk I've been giving to thousands of entrepreneurs for a few years now on "How to Persuade People."  More detailed examples are explained in the links below.

THE BASICS

1. Persuasion is not Manipulation - Manipulation is coercion through force to get someone to do something that is not in their own interest.  Persuasion is the art of getting people to do things that are in their own best interest that also benefit you.

2. Persuade the Persuadable -  Everyone can be persuaded, given the right timing and context, but not necessarily in the short term.  Political campaigns focus their time and money on a small set of swing voters who decide elections.  The first step of persuasion is always to identify those people that at a given time are persuadable to your point of view and focus your energy and attention on them.

3. Context and Timing - The basic building blocks of persuasion are context and timing.  Context creates a relative standard of what's acceptable.  For example, the Stanford Prison Experiment showed that overachieving students could be molded into dictatorial prison guards.  Timing dictates what we want from others and life.  We choose to marry a different type of person than we date when we're younger, because what we want changes.

4. You have to be Interested to be Persuaded  -  You can never persuade somebody who’s not interested in what you’re saying.  We are all most interested in ourselves, and spend most of our time thinking about either money, love or health.  The first art of persuasion is learning how to consistently talk to people about them; if you do that then you’ll always have their captive attention.

GENERAL RULES

5.  Reciprocity Compels – When I do something for you, you feel compelled to do something for me.  It is part of our evolutionary DNA to help each other out to survive as a species.  More importantly, you can leverage reciprocity disproportionately in your favor.  By providing small gestures of consideration to others, you can ask for more back in return, which others will happily provide.  (TIP: read "Influence" by Robert Cialdini)

6.  Persistence Pays - The person who is willing to keep asking for what they want, and keeps demonstrating value, is ultimately the most persuasive.  The way that so many historical figures have ultimately persuaded masses of people is by staying persistent in their endeavors and message.  Consider Abraham Lincoln, who lost his mother, three sons, a sister, his girlfriend,  failed in business and lost eight separate elections before he was elected president of the United States.

7.  Compliment Sincerely - We are all so positively affected by compliments, and we're more apt to trust people for whom we have good feelings.  Try complimenting people sincerely and often for things they aren't typically complimented for; it's the easiest thing you can do to persuade others, and it costs nothing but a moment of thought.

8.  Set Expectations - Much of persuasion is managing others' expectations so that they trust your judgment.  The CEO who promises a 20% increase in sales and delivers a 30% increase is rewarded, while the same CEO who promises a 40% increase and delivers 35% is punished.  Persuasion is simply about understanding and over-delivering on others' expectations.

9.  Don't Assume - Never assume what someone needs; always offer your value.  In sales we'll often hold back from offering our products/services because we assume others don't have the money or interest.  Don't assume what others might want or not want; offer what you can provide and leave the choice to them.

10.  Create Scarcity  – Besides the necessities to survive, almost everything has value on a relative scale.  We want things because other people want these things.  If you want somebody to want what you have, you have to make that object scarce, even if that object is yourself.

11.  Create Urgency  –  You have to be able to instill a sense of urgency in people to want to act right away. If we’re not motivated enough to want something right now, it’s unlikely we’ll find that motivation in the future.  We have to persuade people in the present, and urgency is our most valuable card to play.

12.  Images Matter – What we see is more potent than what we hear.  It may be why pharma companies are now so forthcoming with the potentially horrible side effects of their drugs when set to a background of folks enjoying a sunset in Hawaii.  Perfect your first impressions.  And master the ability to paint an image for others, in their mind's eye, of a future experience you can provide for them.

13.  Truth-Tell – Sometimes the most effective way to persuade somebody is by telling them the things about themselves that nobody else is willing to say.  Facing a hard truth is among the most piercing, meaningful events in our lives.  Truth-tell without judgement or agenda, and you'll often find others' responses quite surprising.

14.  Build Rapport - We like people who are like us.  This extends beyond our conscious decisions to our unconscious behaviors.  By mirroring and matching others' habitual behaviors (body language, cadence, language patterns, etc.) you can build a sense of rapport where people feel more comfortable with you and become more open to your suggestions.


PERSONAL SKILLS

15.  Behavioral Flexibility - It’s the person with the most flexibility, not necessarily the most power, who’s in control.  Children are often so persuasive because they’re willing to go through a litany of behaviors to get what they want (pouting, crying, bargaining, pleading, charming), while parents are stuck with the single response of “No.”  The larger your repertoire of behaviors, the more persuasive you’ll be.

16.  Learn to Transfer Energy - Some people drain us of our energy, while others infuse us with it.  The most persuasive people know how to transfer their energy to others, to motivate and invigorate them.  Sometimes it’s as straightforward as eye contact, physical touch, laughter, excitement in verbal responses, or even just active listening.

17.  Communicating Clearly is Key - If you can’t explain your concept or point of view to an 8th grader, such that they could explain it with sufficient clarity to another adult, it’s too complicated.  The art of persuasion lies in simplifying something down to its core, and communicating to others what they really care about.

18.  Being Prepared Gives you the Advantage - Your starting point should always be to know more about the people and situations around you.  Meticulous preparation allows for effective persuasion.  For example, you dramatically improve your odds in a job interview by being completely versed in the company’s products, services, and background.

19.  Detach and Stay Calm in Conflict - Nobody is effective when they are “On Tilt.”  In situations of heightened emotion, you’ll always have the most leverage by staying calm, detached, and unemotional.  In conflict, people turn to those in control of their emotions, and trust them in those moments to lead them.

20.  Use Anger Purposefully - Most people are uncomfortable with conflict.  If you’re willing to escalate a situation to a heightened level of tension and conflict, in many cases others will back down.  Use this sparingly, and don’t do it from an emotional place or out of a loss of self-control.  But do remember, you can use anger purposefully to your advantage.

21.  Confidence and Certainty - There is no quality as compelling, intoxicating and attractive as certainty.  It is the person who has an unbridled sense of certainty that will always be able to persuade others.  If you really believe in what you do, you will always be able to persuade others to do what’s right for them, while getting what you want in return.

This article is available online at: 
http://www.forbes.com/sites/jasonnazar/2013/03/26/the-21-principles-of-persuasion/

Mar 26, 2013

Robert Cialdini's animated YouTube video on the principles from Influence: the Science of Persuasion.

This animated video describes the six universal Principles of Persuasion that have been scientifically proven to make you most effective as reported in Dr. Cialdini’s groundbreaking book, Influence. This video is narrated by Dr. Robert Cialdini and Steve Martin, CMCT (co-author of YES & The Small Big).

About Robert Cialdini:
Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing at Arizona State University, has spent his entire career researching the science of influence, earning him a worldwide reputation as an expert in the fields of persuasion, compliance, and negotiation.

Dr. Cialdini’s books, including Influence: Science & Practice and Influence: The Psychology of Persuasion, are the result of decades of peer-reviewed published research on why people comply with requests. Influence has sold over 3 million copies, is a New York Times Bestseller and has been published in 30 languages.

Because of the worldwide recognition of Dr. Cialdini’s cutting-edge scientific research and his ethical business and policy applications, he is frequently regarded as the “Godfather of Influence.”