Showing posts with label persuasion. Show all posts

Feb 27, 2022

Victim Derogation in the Cultic Context

Linda Demaine
ICSA Annual Conference: Victim Derogation in the Cultic Context


3:00 PM - 3:50 PM

Friday, June 24th



The goal of the present project is to contribute to the existing literature on why victims of cults are often assigned great responsibility for the losses they sustained, even though they actually exerted little control over the environment in which those losses occurred.


The project investigates the ways in which we as a society tend to conceive of harm and how we treat persons who we conclude have suffered harm. Some harms are generally considered to have a stronger impact on the person than other harms. At times, these conclusions are accurate, yet in other instances the magnitude of the harm is over- or under-estimated. Some harms are more socially accepted than other harms, for example, because they derive from certain sources or happened in particular ways. Some harms are readily visible whereas others are more difficult to identify, rendering the latter more suspect and less accurately assessed even when acknowledged. These and related demarcations are important, because we commonly feel great sympathy toward victims of certain types of harm yet show a propensity to further injure other victims by placing unwarranted blame on them. The latter victims endure not only the original traumatic experience but layered on that an unjustified degree of responsibility for the outcome.


The project considers the underlying psychological bases for the victim derogation phenomenon and explores the degree to which they manifest in the legal system’s willingness to recognize and vindicate different types of harm. It then applies this perspective to help explain victim derogation in the cultic context.



Linda Demaine

Professor of Law

Arizona State University, College of Law

Linda J. Demaine, JD, PhD (social psychology), is Professor of Law and Affiliate Professor of Psychology at Arizona State University. She is founder and director of ASU's Law and Psychology Graduate Program. Before arriving at ASU, Dr. Demaine was a behavioral scientist and policy analyst at RAND, where she led and participated in diverse projects, including an analysis of biotechnology patents and the strategic use of deception and other psychological principles in defense of critical computer networks. Dr. Demaine has held an American Psychological Association Congressional Fellowship, through which she worked with the Senate Judiciary Committee on FBI and DOJ oversight, judicial nominations, and legislation. She has also held an American Psychological Association Science Policy Fellowship, working with the Central Intelligence Agency's Behavioral Sciences Unit on issues involving cross-cultural persuasion. Dr. Demaine's research interests include the empirical analysis of law, legal procedure, and legal decision making; the application of legal and psychological perspectives to social issues; ethical, legal, and social issues deriving from advances in technology; and information campaigns and persuasion.


The International Cultic Studies Association (ICSA) is conducting its 2022 Annual International Conference jointly with Info-Secte/Info-Cult of Montreal.  Conference Theme: Exploring the Needs of People Who Leave Controlling Groups and Environments

Register: https://whova.com/web/icsaa_202207/


Aug 14, 2021

The matter of mind control

Dark Persuasion
Joel E. Dimsdale
Yale University Press, 2021. 304 pp.
Science 13 Aug 2021:
Vol. 373, Issue 6556, pp. 749


In 1976, Patty Hearst, the granddaughter of American publishing magnate William Randolph Hearst, was found guilty of bank robbery, a crime she committed after enduring a sustained period as a captive of the domestic terrorist organization known as the Symbionese Liberation Army. The trial, with its glittering cast of expert witnesses, became a test case for psychological theories of brainwashing.

In addition to commenting on Hearst's intelligence and differentiating her behavior from that typically displayed by malingerers, the experts invoked the experiences of US prisoners of war (POWs) in Korea and the concept of “debility, dependency, and dread” to explain how Hearst, in their view, was not acting of her own free will when she committed the robbery. Instead, they argued, she was in thrall to the coercive persuasion of her captors. The jury was unconvinced, and Hearst was sentenced to 35 years in prison. President Jimmy Carter, however, found the arguments persuasive and commuted her sentence after 22 months, and President Bill Clinton pardoned Hearst in 2001. This trial, and the debates surrounding it, is one of 10 key moments in the history of the idea of brainwashing examined by psychiatrist Joel E. Dimsdale in his new book, Dark Persuasion.

A tragedy close to home inspired Dimsdale to dive deeply into this topic. In 1997, as the comet Hale-Bopp approached, his neighbors committed suicide on the instructions of the leaders of the Heaven's Gate cult, who were convinced that death was necessary to free members of their “bodily vehicles” and allow them to ascend to heaven.

The word “brainwashing” was coined in the early years of the Cold War by journalist Edward Hunter to describe reeducation techniques used for indoctrination in Communist China, and it was subsequently invoked to make sense of the defection of 21 American POWs to China after the Korean War. Elements of brainwashing can be traced back much further, to techniques used in religious conversion and inquisition. But the fear unleashed by the provocative term precipitated the Central Intelligence Agency's infamous MKUltra experiments, which made use of the hallucinogen LSD, sensory deprivation, and electroconvulsive therapy during interrogations to coerce confessions. These efforts reached their bleak crescendo with a series of highly abusive experiments carried out on psychiatric patients by McGill University psychiatrist Donald Ewen Cameron between 1957 and 1964.

The book avoids making sweeping claims about the nature of brainwashing, but after a chronological assessment of each case study, Dimsdale presents a comparative table of the key features at play, including techniques such as sleep manipulation, coercion and manipulation, intentional surreptitiousness, and participation in activities not in the subject's best interests. That some combination of these is present in varying degrees in all of the cases cited suggests that Dimsdale sees them as necessary and sufficient criteria for describing brainwashing with some degree of certainty. Some of the 21st-century examples he discusses briefly in the book's last section, including controversies surrounding deep brain stimulation and the rise of social media platforms and conspiracy theories, do not quite fit these criteria, however.

While there is novelty in the synthesis of these case studies, Dark Persuasion does not offer much new material, and Dimsdale has not unearthed any substantial unexpected archival finds or generated new oral histories. However, his account of the Hearst trial is carefully researched and supplemented with a wealth of scientific papers from the time. And while the causal link he suggests between Pavlov's experiments with traumatized dogs and the interrogation and torture processes that led to false confessions in Stalin's show trials is based on slender evidence, Dimsdale's observations about the proximity of these events are nevertheless astute.

But historical rigor is not necessarily the point of this highly readable and compelling book. Dimsdale's goal is to prompt reflection on what he sees as the overlooked reality of coercive persuasion at a broader level and the ever-present threat that it poses to individuals and to society at large—a threat, he warns, that is becoming ever-amplified by new technologies and mass media. In this aim, he succeeds admirably.

https://science.sciencemag.org/content/373/6556/749.full

Sep 7, 2018

opening minds: the secret world of manipulation, undue influence and brainwashing

Opening Minds
Opening Minds by Jon Atack shows how we can be cajoled into accepting unethical, uninvited and often harmful influence. With this book, you can help your friends and your children to resist or escape from manipulation.

We live in an age where unethical persuasion is applied every day, all day long, to bypass reasoning through direct appeals to our emotions. Throughout history, people have been unwittingly influenced to act against their own best interests. But today, ever more sophisticated forms of influence are being devised, posing a significant threat to a free and open society. It is persuasion so sinister and subtle that it can derail critical thinking and overwhelm even the most intelligent of people, reducing them to unthinking compliance.

Manipulation, undue influence and brainwashing, or whatever we call this exploitative persuasion, challenges the very notion of human rights. Its use by unscrupulous cults, totalitarian groups and abusive individuals is growing at an alarming rate. But such influence is also deployed daily by marketing organizations and Internet promoters, targeting young people and interfering with their reasoning skills. It is surprising that undue influence remains a well-kept secret in the media and for the general public.

The knowledge in this book is vital for those who wish to help their loved ones spot, resist and escape the many traps of undue influence in our world.

ORDER NOW!

Mar 4, 2018

Mind games: Inside the mysterious world of the mentalists

Karan Singh, a mentalist, at his house in Faridabad.(Raj K Raj/HT PHOTO)
Meet the ‘mystic men’ who are popularising mentalism as a performing art in India

Manoj Sharma
Hindustan Times
March 3, 2018

A one-on-one conversation with Karan Singh can be quite an unsettling experience. As you speak, sitting before him, he looks intently at you -- too intently, in fact, for your comfort. Singh, a mentalist, is trying to read your expressions and body language to play tricks of the mind on you.

Dressed in jeans and a casual navy blue shirt, sporting a stubble and a ponytail, Singh looks every bit the modern mystic. He asks you to think about things happening in your life and to silently say some numbers and letters in your mind. And, in no time, as if he has peeped into your mind, he tells you your birthday, your ATM pin, your favourite city, even the exact thought that crossed your mind at that instant. In fact, revealing phone passwords and ATM pins is his signature trick. He says he recently stunned Shah Rukh Khan by disclosing his ATM pin at a New Year party at Aamir Khan’s house.

“What goes on in your mind comes out in your body language. I don’t have any psychic powers. Mentalism is an acquired skill,” says Singh, 26, one of India’s most famous mentalists. “How you sit, how you rest your feet, how you breathe, how you purse your lips, all give away several intimate facts of your life – including whether you are happy in your marriage, relationship, or job.”

Singh’s house in Faridabad is like a little museum of mentalism: there are posters of Sherlock Holmes, magician Harry Houdini and mentalist Derren Brown, and one from The Prestige, a mystery movie in which two friends and fellow magicians become bitter enemies. Then there is an assortment of dice, Harry Potter replica wands and dozens of books on psychology and mentalism.

While mentalism as a performing art, in which the mentalist demonstrates highly evolved mental abilities or paranormal effects, has long been popular in the West, it is fast picking up in India, thanks to a new breed of young, suave mentalists.

They specialise in what they call ‘psychological illusion’, and claim to blend psychology, hypnosis, suggestion, cold reading, neuro-linguistic programming (NLP), misdirection, and other subtle skills of observation to create the illusion of a sixth sense. “A combination of this enables me to accomplish mind reading, psychokinesis and telepathy,” says Singh, who dropped out of college to pursue his passion for mentalism.

And unlike magicians, who use props and sleight of hand, mentalists use their audience as their props and perform tricks of the mind such as reading your thoughts, planting a thought in your mind, even making you forget your name. “When we talk to someone, 60 per cent of communication happens through body language, 30 per cent through the tone of what we are saying, and only 10 per cent through the actual words being spoken,” says Mohit Rao, 39, who was a marketer before he became a mentalist. “What I do is undertake a journey into your mind using my inherent skills and a range of different sciences as tools.”

Rao performs a show called ‘The Wolf of Dalal Street’ for his corporate clients, which involves mind reading, telepathy, hypnosis, walking on broken glass and, his signature trick, predicting the exact closing value of the Sensex on the day of the event. “Mentalism as an art form is all about mystery, amazement, and unforgettable entertainment. Hypnosis can also be a great tool of meditation and relaxation,” he says.

Rao got interested in mentalism after an interaction with a psychologist in 2010. “He said think of an elephant; then he said don’t think of two elephants, and then he said don’t think of two elephants that are pink in colour. I was stunned by how he was trying to manipulate my thoughts. I was thinking exactly what he was asking me not to think,” says Rao, who did four years of intense research before he performed his first show. “I even took a month off from my job to study the sciences behind this art. That one month changed the course of my life,” says Rao. “When I quit my job seven years back to become a mentalist, everyone thought I had lost my mind.”

Nakul Shenoy, 40, another well-known mentalist, says that as a child he was inspired by the comic series Mandrake the Magician, whose hero could hypnotically make people see whatever he wanted them to see. “I consider myself a mystery entertainer,” says Shenoy, who is a member of the Psychic Entertainers Association (USA) and the British Society of Psychic Entertainers (UK).

Mentalism, he says, falls in the larger realm of magic. While magic makes the impossible possible, mentalism makes the improbable happen. “A magician’s skills lie in producing something out of nothing, performing vanishes, transformations, transpositions, or levitations, while a mentalist can read people’s minds through verbal manipulation, demonstrate telepathy or clairvoyance,” says Shenoy, who was a ‘user experience’ (UX) professional before he became a mentalist. “It is not about rare powers, it is about acquiring and honing skills. During my shows, I keep telling people to be wary of godmen, who use nothing but mentalism and magic to fool people.”

Shenoy’s signature tricks include reading people’s minds, demonstrating ‘superhuman memory’ and predicting people’s choices and actions on stage. “My favourite effects are those where I am performing direct mind reading of my volunteers on stage,” he says.

Singh, who also studied theatre in London, does both theatre and corporate shows. In 2016, he performed Merchant of Menace, a theatre show in eight cities across the country. “In the entire 90-minute show, the audience are the actors—they come on stage, become part of the story of my childhood. I play tricks of memory and hypnosis on them.”

Most of these mentalists are psychology buffs, master communicators and performers, have delivered several Tedx talks, and derive their inspiration from the likes of Derren Brown and Robert Beno Cialdini, a social psychologist. Interestingly, all of them are avid fans of The Mentalist, an American TV series in which Patrick Jane (played by Simon Baker), an independent consultant for the California Bureau of Investigation (CBI), uses his skills from his former career as a successful, yet admittedly fraudulent, psychic to help a team of CBI agents to solve murders.

“I have watched the series many times over, and I love it. In fact, I get some good ideas from it for my shows,” says Singh, laughing. But given the opportunity, would he want to play Patrick Jane in real life? “When I watched it the first time in my teens, I was fascinated and wondered if I too could do it for a living. But mentalism is not an exact science and I can be wrong. So, I would not want to venture into crime investigation,” he says.

Mentalists are much sought after by corporate houses these days, not just as entertainers but also as a friend, philosopher and guide for their employees. Their shows, mentalists say, combine knowledge sharing with shock and awe. In Shenoy’s words, “It is cerebral entertainment”. To explain why, he cites the books of Robert Cialdini, who has written many bestsellers, including ‘Influence: The Psychology of Persuasion’ and, more recently, ‘Pre-Suasion: A Revolutionary Way to Influence and Persuade’.

“His work brings out how we are persuaded and can persuade others through words and actions. I also deliver lessons in the art of persuasion in a way which is interactive, amazing and entertaining,” he says. “Companies expect us to both enthral and motivate their employees. My shows are structured to have a story with a message. The idea is to help them understand and unleash the power of the mind.”

A mentalist, Shenoy says, has to worry about both the mechanics and performance part of his show. “Performance is very important. When I do a show, I have to be very careful about who I invite on the stage. It is not a random selection, despite it appearing to be. These people are carefully chosen”.

So do mentalists have to practice their craft? “Yes. They have to keep innovating and coming up with new tricks, otherwise they run the risk of becoming irrelevant,” Singh says. “I make videos of my family and friends to study similarities and differences in their expressions. Body language, as I said, is the key to figuring out what is going on in someone’s mind.”

https://www.hindustantimes.com/delhi-news/mind-games-inside-the-mysterious-world-of-the-mentalists/story-6cF32o8WGvlM2Ad5PLdfRL.html

Oct 1, 2017

Want to Make a Lie Seem True? Say It Again, and Again, and Again

EMILY DREYFUSS
Wired
February 11, 2017

YOU ONLY USE 10 percent of your brain. Eating carrots improves your eyesight. Vitamin C cures the common cold. Crime in the United States is at an all-time high.

None of those things are true.

But the facts don't actually matter: People repeat them so often that you believe them. Welcome to the “illusory truth effect,” a glitch in the human psyche that equates repetition with truth. Marketers and politicians are masters of manipulating this particular cognitive bias—which perhaps you have become more familiar with lately.

President Trump is a "great businessman," he says over and over again. Some evidence suggests that might not be true. Or look at just this week, when the president signed three executive orders designed to stop what he describes—over and over again—as high levels of violence against law enforcement in America. Sounds important, right? But such crimes are at their lowest rates in decades, as are most violent crimes in the US. Not exactly, as the president would have it, "American carnage."



"President Trump intends to build task forces to investigate and stop national trends that don’t exist," says Jeffery Robinson, deputy legal director of the American Civil Liberties Union. He's right that the trends aren't real, of course. But some number of people still believe it. Every time the president tweets or says something untrue, fact-checkers race to point out the falsehood—to little effect. A Pew Research poll last fall found 57 percent of presidential election voters believed crime across the US had gotten worse since 2008, despite FBI data showing it had fallen by about 20 percent.

So what's going on here? "Repetition makes things seem more plausible," says Lynn Hasher, a psychologist at the University of Toronto whose research team first noticed the effect in the 1970s. "And the effect is likely more powerful when people are tired or distracted by other information." So ... 2017, basically.

Brain Feels


Remember those "Head On! Apply Directly to the Forehead!" commercials? That's the illusory truth effect in action. The ads repeated the phrase so often that people found themselves at the drugstore staring at a glue-stick-like contraption thinking, "Apply directly to MY forehead!" The question of whether it actually alleviates pain gets smothered by a combination of tagline bludgeoning and tension headache.

Repetition is what makes fake news work, too, as researchers at Central Washington University pointed out in a study way back in 2012 before the term was everywhere. It's also a staple of political propaganda. It's why flacks feed politicians and CEOs sound bites that they can say over and over again. Not to go all Godwin's Law on you, but even Adolf Hitler knew about the technique. "Slogans should be persistently repeated until the very last individual has come to grasp the idea," he wrote in Mein Kampf.

The effect works because when people attempt to assess truth they rely on two things: whether the information jibes with their understanding, and whether it feels familiar. The first condition is logical: People compare new information with what they already know to be true and consider the credibility of both sources. But researchers have found that familiarity can trump rationality—so much so that hearing over and over again that a certain fact is wrong can have a paradoxical effect. It's so familiar that it starts to feel right.

"When you see the fact for the second time it's much easier to process—you read it more quickly, you understand it more fluently," says Vanderbilt University psychologist Lisa Fazio. "Our brain interprets that fluency as a signal for something being true"—whether it's true or not. In other words, rationality can be hard. It takes work. Your busy brain is often more comfortable running on feeling.

You are busy, too, so let me get back to Trump's latest executive orders, which are mostly symbolic. They certify that the government will do what it can to keep law enforcement officers safe. They contain vague language that civil rights advocates worry could lead to the criminalization of protest. But while perhaps unnecessary, the orders are hardly pointless—they reinforce the idea that America is unsafe, that law enforcement officers are at risk, that the country needs a strong "law and order" president. Data be damned.

As with any cognitive bias, the best way not to fall prey to it is to know it exists. If you read something that just feels right, but you don't know why, take notice. Look into it. Check the data. If that sounds like too much work, well, facts are fun.


Sep 25, 2017

Why Hard Facts Aren't Enough to Alter Our Beliefs

If we want to affect the behaviors and beliefs of the person in front of us, we need to understand what goes on inside their head.

Tali Sharot
NBC
June 25, 2017

People love propagating information and sharing opinions. You can see this online: every single day, four million new blogs are written, eighty million new Instagram photos are uploaded, and 616 million new tweets are released into cyberspace. It appears the opportunity to impart your knowledge to others is internally rewarding. A study conducted at Harvard University found that people were willing to forgo money so that their opinions would be broadcast to others. We are not talking about well-crafted insights here. These were people’s opinions regarding mundane issues, like whether coffee is better than tea. A brain-imaging scan showed that when people received the opportunity to communicate their opinions to others, their brain’s reward center was strongly activated. We experience a burst of pleasure when we share our thoughts, and this drives us to communicate. It is a useful feature of our brain, because it ensures that knowledge, experience and ideas do not get buried with the person who first had them, and that as a society we benefit from the products of many minds.

Of course, in order for that to happen, merely sharing is not enough. We need to cause a reaction —what Steve Jobs aptly referred to as making a “dent in the universe.” Each time we share our opinions and knowledge, it is with the intention of impacting others. Here is the problem, though: we approach this task from inside our own heads. When attempting to create impact, we first and foremost consider ourselves. We reflect on what is persuasive to us, our state of mind, our desires and our goals. But if we want to affect the behaviors and beliefs of the person in front of us, we need to understand what goes on inside their head.


What determines whether you affect the way others think and behave or whether you are ignored? You may assume that numbers and statistics are what you need to change their point of view. As a scientist I certainly used to think so. Good data, coupled with logical thinking – that’s bound to change minds, right? So I set out to test whether information alters people’s beliefs. My colleagues and I conducted dozens of experiments to figure out what causes people to change their decisions, update their beliefs, and rewrite their memories. We peered into people’s brains, recorded bodily responses, and documented behavior.

Well, you can imagine my dismay when I discovered that all these experiments pointed to the fact that people are not driven by facts. While people do adore data, hard facts are not enough to alter beliefs, and they are practically useless for motivating action. Consider cli­mate change: there are mountains of data indicating that humans play a role in warming the globe, yet approximately 50 percent of the world’s population does not believe it.

The problem with an approach that prioritizes information is that it ignores the core of what makes us human: our motives, our fears, our hopes, our desires, our prior beliefs. In fact, the tsunami of information we are receiving today can make us even less sensitive to data, because we’ve become accustomed to finding support for absolutely anything we want to believe with a simple click of the mouse. Instead, our desires are what shape our beliefs: our need for agency, our craving to be right, a longing to feel part of a group.


So when it comes to getting your message across to others, consider if you can reframe the information you provide such that it taps into people’s basic motives. This does not mean altering the information itself, but rather presenting it in a different frame. For example, research suggests that framing advice to highlight how things can improve is more effective at changing behavior than warnings and threats, because it generates hope in people. Explaining how exercise improves health, for instance, is more likely to get people to the gym than warning them of obesity and related illnesses.

And when it comes to altering how you respond to information, being aware of our biases can help. When you find yourself dismissing information that does not quite fit your world view, take a pause and reevaluate. Could there be merit in this new information, and could you use it to expand your views? Science has shown that waiting just a couple of minutes before making judgments reduces the likelihood that they will be based solely on instinct.

Tali Sharot is an Associate Professor of Cognitive Neuroscience at University College London, director of the Affective Brain Lab and the author, most recently, of The Influential Mind: What the Brain Reveals About Our Power to Change Others (Henry Holt).

https://www.nbcnews.com/better/health/why-hard-facts-aren-t-enough-alter-our-beliefs-ncna803946

Feb 28, 2016

MIND CONTROL: How people become trapped in Cults

Jan 22, 2009


How normal people like you and me can gradually, over a period of time, be duped and deceived by a destructive cult!

This video also reveals another frightening scenario: how a handful of corrupt people in positions of authority in the military could issue unjust unconstitutional orders to their subordinates to carry out acts of violence against their fellow countrymen, citizens who are guilty of nothing more than exercising their God-given Constitutional rights to keep and bear arms and protect their lands and their homes and the lives of themselves and their families!

https://youtu.be/8aw_5cmCwoc

Jan 14, 2016

Derren Brown 'persuades' two ordinary women to push a stranger off a roof

Harry Mount
Daily Mail
January 12, 2016

Would you abduct a baby from a cafe if a stranger told you to? Or push someone off a tower block to their death on a stranger’s orders? Of course not — unless you were part of a haunting social experiment performed by the illusionist Derren Brown.

Last night on Channel 4, in a programme called Pushed To The Edge, Brown did exactly that. Under Brown’s instruction, an actor playing a policeman persuaded a waiter in a cafe to steal a baby from its mother — after telling the waiter the mother was a child abductor.

And then Brown pulled off an even more staggering scam, pulling the strings from behind the scenes — he is too recognisable to fool his victims in the flesh. 

Moment of horror: Laura uses both hands to shove the helpless man off the roof, in Pushed To The Edge

In an elaborate ruse, Brown staged a fake charity auction on four separate occasions and got four strangers — two men and two women — to attend, one at each event, on the pretext that they might not only get lucrative contracts to work for the charity but would also get the chance to network with one of its millionaire donors.

Having agreed to attend, the four strangers were desperate to keep in with the (fake) charity head who was dangling the contract in front of them — and, in the hope of remaining on good terms with him, they agreed to commit increasingly wicked crimes. Frighteningly, each stranger was persuaded over the course of just a matter of hours.

First, they were told the millionaire had dropped dead of a heart attack — a disaster for the charity — and this had to be covered up. All four were persuaded by the ‘charity head’ to hide the donor’s body and impersonate him at the auction.

When the ‘charity head’ changed his mind and accepted that the death would have to be disclosed, he insisted it would have to look like an accident. Three of the participants — Hannah, Laura and Martin — agreed to kick the ‘dead body’ in the stomach, to produce the bruises that would result from a fall. Only Chris Kingston, 29, refused.

The four were later duped again when they were told the millionaire hadn’t died at all and had, in fact, just fainted. After this, the incensed donor — very much alive — threatened to ensure they were sent to jail.

In separately staged scenarios, Hannah, Laura and Martin — egged on by a small group of (fake) charity workers — physically shoved the man off the roof, seemingly to his death.

Of course, the actor playing the donor was wearing a harness, which the strangers couldn’t see. The actor dangled happily from a safety rope before being rescued.

Again, only Chris Kingston, co-director of a printing and design company, refused to throw him off.

At the end of the programme, Derren revealed himself as the puppeteer of the hoax and the strangers realised with relief — if, indeed, they weren’t in on the act — that they had been victims of a horrible trick.

The experiment was part of Brown’s demonstration of the power of social compliance: following orders because someone in authority tells you it’s the right thing to do.

‘It’s surprisingly easy to pretend to be an authority figure such as a policeman and persuade someone to do something they would never normally do,’ says Brown.

‘Authority can come from a person, a group of like-minded individuals or an ideology,’ he says. ‘It can help keep public order but it can also push people to commit terrible acts. Can social compliance make someone push a living, breathing human being to their death?’

‘Yes’ was the resounding answer — even if, in this case, the actor playing the millionaire was safely attached to a harness.

Of course, some of us are naturally more susceptible to peer pressure and following orders than others.

‘Social compliance in various forms is something I work with time and again,’ says Julia Bueno, a psychotherapist in North London. ‘People can be deeply affected by group values and a mentality of “keeping up with the Joneses”.

‘The importance of being part of a group has its roots in evolution. It kept us safe. Being ostracised could kill us. It can take a certain amount of courage to step outside our peer group and be an individual.’

To recruit four exceptionally socially compliant volunteers, Derren Brown auditioned 2,000 members of the public. As part of the vetting process, the potential volunteers were called into a room to be interviewed in groups of four; however, three of the four were actors planted by Brown, and only one was a genuine volunteer.

A bell then went off and the three actors in the room stood up automatically. The four contestants who were eventually chosen had shown their compliance by immediately standing up too. At the fake charity auction, Brown, concealed behind the scenes, introduced hidden elements to make his impressionable victims even more compliant.

For example, while other guests wore smart suits, the four volunteers were casually dressed, meaning they felt one down compared to their formally dressed companions.

Within seconds, they accepted a deferential role, following orders to fetch drinks and carry bags for their apparent superiors.

Their propensity for deference and social compliance was intensified by a celebrity element: actor David Tennant, singer Robbie Williams and Stephen Fry — all in on the scam — gave video appeals in aid of the fake charity.

Brown also employed the ‘foot-in-the-door technique’ — the principle that, if you do someone a small favour, you’ll then be more inclined to do a bigger one for them, too, because you’ve already accepted a deferential role.

Pretending the vegetarian food had run out, the ‘charity head’ asked the volunteers an awkward favour: to fraudulently tell veggie guests that the sausage rolls on offer were meat-free. From this small lie, the criminality escalated until three of them were prepared to kill.

For all its elaborate brilliance, Brown’s experiment is not new. For half a century, scientists have been proving how law-abiding, apparently moral humans are prepared to inflict pain when instructed to do so.

The most famous experiment was carried out by Stanley Milgram, an American social psychologist who died in 1984.

Milgram was inspired by the 1961 trial of the Nazi war criminal, Adolf Eichmann. Eichmann defended himself in court by saying he was only following orders when he arranged the mass killing of Jews during the Holocaust.

‘The person who, with inner conviction, loathes stealing, killing and assault may find himself performing these acts with relative ease when commanded by authority,’ Milgram wrote in his 1974 book Obedience To Authority.

‘Behaviour that is unthinkable in an individual who is acting on his own may be executed without hesitation when carried out under orders.’

In 1961, Milgram set up his spine-chilling experiment at Yale University in which volunteers were asked by a scientist to inflict electric shocks on a mild-mannered, 47-year-old accountant.

The participants were told the experiment was investigating whether punishment made people better at learning. Every time the accountant got an answer wrong, the volunteers were asked to give him an electric shock.

In reality, no actual shock was inflicted, but the volunteers thought they were delivering shocks ranging from 15 to 450 volts. Each time the accountant got an answer wrong, the volunteers were asked to increase the voltage.

Of 40 volunteers, 26 kept on obeying orders to the end, raising the shock level to the maximum 450 volts, and applying that three times.

In the first version of Milgram’s experiment, the victim was placed in a separate room from the volunteers, where he couldn’t be seen. At 300 volts, though, they could hear him banging on the wall. Most volunteers still raised the voltage to 315 volts — when the banging on the wall ominously stopped.

The nearer the volunteer was to the victim, the less likely he was to give him a shock. When the volunteer was in the same room as the victim — and had to place the victim’s hand on the electric plate — 70 per cent refused to continue.

Milgram’s conclusions were horrifying. He said that, for many people, there were no limits to their obedience.

‘Cries from the victim were inserted; they were not good enough,’ Milgram said. ‘The victim claimed heart trouble; subjects still shocked him on command. The victim pleaded to be let free . . . subjects continued to shock him.’

Milgram tested different groups of volunteers. An all-women group was just as ready to shock the victim, although they reported higher levels of stress than men.

In a separate experiment in 1972, scientists at Missouri University and the University of California asked volunteers to give a puppy genuine, but harmless, electric shocks. Half the male volunteers, and all the female volunteers, agreed.

In 2010, a French documentary — Le Jeu De La Mort (The Game of Death) — carried out the Milgram experiment, disguising it as a game show. Of the 80 contestants, 66 gave their victims electric shocks to the highest voltage level.

In a related test in 1971, the Stanford Prison Experiment at Stanford University, California, divided volunteers into prisoners and guards.

The experiment was supposed to last a fortnight but it had to be stopped after six days, when the guards grew unbearably brutal to the prisoners.

The scientists concluded the guards were aggressive because they conformed obediently to their role, which was enhanced by wearing a guard’s uniform.

The results of Milgram tests hardly vary across the world. Dr Thomas Blass, a Milgram expert from the University of Maryland, has determined that 61 per cent of American volunteers have agreed to inflict the maximum voltage; 66 per cent of non-American volunteers have done the same.

It’s hard, then, not to agree with Derren Brown’s conclusions last night. Indeed, the most shocking thing about his experiment is that the results are not shocking at all.

‘It’s like we’re handed other people’s scripts of how to live our lives to achieve their ambitions and beliefs,’ says Brown. ‘But by understanding how we can be manipulated, we can become stronger; we can say no.’

If you are one of the volunteers who took part in last night’s show — or know any of them — we’d love to hear from you. Please email: derrenbrown@dailymail.co.uk

Source: http://www.dailymail.co.uk/news/article-3396758/As-TV-illusionist-Derren-Brown-persuades-two-ordinary-women-push-stranger-roof-talked-MURDER.html?ITO=1490&ns_mchannel=rss&ns_campaign=149


Sep 12, 2015

The nudge theory and beyond: how people can play with your mind


Mental manipulation can be backed by good intentions – but when used with stealth, it is deceitful and wrong

Nick Chater
The Observer
September 12, 2015

A couple of decades ago, a class of psychology undergraduates played a mean trick on their lecturer. The students on the right side of the room gently nodded, smiled, and looked thoughtful, while those on the left seemed bored and glum. Before long, the unsuspecting lecturer was addressing the “right” students with enthusiasm, with only the odd uncomfortable glance to the rest. On some secret sign, the students changed roles – and the lecturer duly switched to addressing students to the left. Memories are vague on how often the hapless lecturer was pushed to and fro.

The students’ hilarity was no doubt considerable, especially as the trick used one of the key principles they were being taught: that pigeons, rats or lecturers do more of what is rewarded, and less of what is punished. But how did the lecturer feel when the trick was revealed? In his shoes, I imagine myself trying to summon a brave laugh, but feeling pretty dreadful. Even where no malice is intended, the sense of having been manipulated is hurtful indeed.

University lecturers, like the rest of us, can change their behaviour according to the audience.

So what is manipulation, and why do we hate it? I think it is best viewed as behaviour with the purpose of influencing another person, but which works only if that purpose is concealed. For example, the secret planning of the students’ smiles and frowns was crucial to their scheme’s success. It is the secrecy that really outrages us (with a tinge of humiliation, perhaps, because we were taken in). Manipulation is a form of deceit.

I was reminded of this in a recent talk by the guru of persuasion research, Professor Robert Cialdini of Arizona State University. He eloquently summarised the key forces that persuade us, including the principle that we tend to believe people we like. At the end of his talk, he said he had deliberately begun his presentation with a broad smile: to make us like him and, crucially, therefore to believe him.

I felt an inward shudder; I’d been manipulated, perhaps all too effectively. Of course, this was manipulation with the best of intentions. The secret of the trick was freely shared to help us understand, and guard against, the power of manipulation. But still the shudder remained.

Cialdini was speaking at the recent Behavioural Exchange conference in London, which brought together many of the world’s most celebrated psychologists, behavioural economists and policy-makers to consider how understanding human behaviour can help make governments work better. This sounds innocuous enough. Whether creating aircraft controls, computer interfaces or smartphones, it is a basic principle of design to work with the grain of the human mind, not against it. Why should it be any different for government policy?

The results of the research are intriguing. We heard from the “Nudge Unit”, a spin-out from the cabinet office, how tiny tweaks in government communication may increase the success rates of ethnic minority applicants to join the police, help people to take vital medications, or prompt them to pay their taxes on time. Research by Cornell University’s Brian Wansink, who spoke at the meeting, shows that, for example, we eat more ice-cream when we have a larger bowl, and still more when wielding a larger spoon.

Could smaller cones persuade us to eat less ice cream? Photograph: Woods Wheatcroft/Corbis

With rising levels of obesity and diabetes, perhaps government should tell manufacturers to produce smaller scoops, smaller bowls, smaller ice-cream tubs, smaller packs of sugary or fattening foods, less capacious wine glasses and smaller bottles of alcoholic drinks. Could these, and many other “nudges”, gently steer us to healthier and happier lives, without resorting to punitive taxes or even outright bans on the offending foodstuffs?

But for many of us there is also a sense of disquiet. Doesn’t putting these psychological insights, however well-meaning, into government policy amount to state manipulation of the people? Yet once we understand the nature of manipulation, the remedy is clear. Avoiding it means avoiding deception: a good, honest nudge is one that works even when we know we are being nudged, and why. But the spell cast by a bad, manipulative nudge is broken as soon as its secret is revealed.

Suppose that one of the psychology students had leaked their plan before the lecture. Then the lecturer would have been laughing and his audience feeling foolish as they went through their routine of synchronised, but strangely ineffective, facial expressions. Or suppose Cialdini had announced: “Now I’m going to smile broadly, so that you like me and believe everything I say.” That would surely have been horribly counterproductive.

But which nudges still work, even when they are out in the open? Do we still eat less ice-cream from a bowl labelled “smaller bowls for smaller servings”? The research remains to be done. On the one hand, we might think: “How thoughtful, this is a great way to save myself from over-indulgence.” A good nudge. On the other, we might react with irritation and have an extra helping to “fight back”.

So the upshot is: let’s say no to manipulation – that is, to influence by stealth or deception. This should apply to how governments treat us, and to how we treat each other. (And, just in case any of my students are reading this, it also means no tricks on lecturers.)

http://www.theguardian.com/theobserver/2015/sep/12/nudge-theory-mental-manipulation-wrong

Apr 10, 2013

The 21 Principles of Persuasion - Forbes


Jason Nazar
3/26/2013

How is it that certain people are so incredibly persuasive? Can we all harness those skills? After studying the most influential political, social, business and religious leaders, and trying countless techniques out myself, these are the 21 critical lessons I’ve identified for persuading people. This is an overview from a talk I’ve been giving to thousands of entrepreneurs for a few years now on “How to Persuade People.” More detailed examples are explained in the links below.

THE BASICS

1. Persuasion is not Manipulation - Manipulation is coercion through force to get someone to do something that is not in their own interest.  Persuasion is the art of getting people to do things that are in their own best interest that also benefit you.

2. Persuade the Persuadable - Everyone can be persuaded, given the right timing and context, but not necessarily in the short term. Political campaigns focus their time and money on a small set of swing voters who decide elections. The first step of persuasion is always to identify those people who, at a given time, are persuadable to your point of view, and to focus your energy and attention on them.

3. Context and Timing - The basic building blocks of persuasion are context and timing. Context creates a relative standard of what’s acceptable. For example, the Stanford Prison Experiment showed that overachieving students could be molded into dictatorial prison guards. Timing dictates what we want from others and from life. We choose to marry a different type of person than we date when we’re younger, because what we want changes.

4. You have to be Interested to be Persuaded  -  You can never persuade somebody who’s not interested in what you’re saying.  We are all most interested in ourselves, and spend most of our time thinking about either money, love or health.  The first art of persuasion is learning how to consistently talk to people about them; if you do that then you’ll always have their captive attention.

GENERAL RULES

5. Reciprocity Compels – When I do something for you, you feel compelled to do something for me. It is part of our evolutionary DNA to help each other out to survive as a species. More importantly, you can leverage reciprocity disproportionately in your favor. By providing small gestures of consideration to others, you can ask for more in return, which others will happily provide. (TIP: read “Influence” by Robert Cialdini)

6.  Persistence Pays - The person who is willing to keep asking for what they want, and keeps demonstrating value, is ultimately the most persuasive.  The way that so many historical figures have ultimately persuaded masses of people is by staying persistent in their endeavors and message.  Consider Abraham Lincoln, who lost his mother, three sons, a sister, his girlfriend,  failed in business and lost eight separate elections before he was elected president of the United States.

7. Compliment Sincerely - We are all so positively affected by compliments, and we’re more apt to trust people for whom we have good feelings. Try complimenting people sincerely and often for things they aren’t typically complimented for; it’s the easiest way to persuade others, and it costs nothing but a moment of thought.

8. Set Expectations - Much of persuasion is managing others’ expectations so that they trust in your judgment. The CEO who promises a 20% increase in sales and delivers a 30% increase is rewarded, while the same CEO who promises a 40% increase and delivers 35% is punished. Persuasion is simply about understanding and over-delivering on others’ expectations.

9. Don’t Assume - Don’t ever assume what someone needs; always offer your value. In sales, we’ll often hold back from offering our products or services because we assume others don’t have the money or interest. Don’t assume what others might or might not want; offer what you can provide and leave the choice to them.

10.  Create Scarcity  – Besides the necessities to survive, almost everything has value on a relative scale.  We want things because other people want these things.  If you want somebody to want what you have, you have to make that object scarce, even if that object is yourself.

11.  Create Urgency  –  You have to be able to instill a sense of urgency in people to want to act right away. If we’re not motivated enough to want something right now, it’s unlikely we’ll find that motivation in the future.  We have to persuade people in the present, and urgency is our most valuable card to play.

12. Images Matter – What we see is more potent than what we hear. It may be why pharma companies are now so forthcoming with the potentially horrible side effects of their drugs when set to a background of folks enjoying a sunset in Hawaii. Perfect your first impressions. And master the ability to paint an image for others, in their mind’s eye, of a future experience you can provide for them.

13. Truth-Tell – Sometimes the most effective way to persuade somebody is to tell them the things about themselves that nobody else is willing to say. Facing hard truths is among the most piercing, meaningful experiences of our lives. Truth-tell without judgment or agenda, and you’ll often find others’ responses quite surprising.

14. Build Rapport - We like people who are like us. This extends beyond our conscious decisions to our unconscious behaviors. By mirroring and matching others’ habitual behaviors (body language, cadence, language patterns, etc.) you can build a sense of rapport in which people feel more comfortable with you and become more open to your suggestions.


PERSONAL SKILLS

15. Behavioral Flexibility - It’s the person with the most flexibility, not necessarily the most power, who’s in control. Children are often so persuasive because they’re willing to go through a litany of behaviors to get what they want (pouting, crying, bargaining, pleading, charming), while parents are stuck with the single response of “No.” The larger your repertoire of behaviors, the more persuasive you’ll be.

16.  Learn to Transfer Energy - Some people drain us of our energy, while others infuse us with it.  The most persuasive people know how to transfer their energy to others, to motivate and invigorate them.  Sometimes it’s as straightforward as eye contact, physical touch, laughter, excitement in verbal responses, or even just active listening.

17.  Communicating Clearly is Key - If you can’t explain your concept or point of view to an 8th grader, such that they could explain it with sufficient clarity to another adult, it’s too complicated.  The art of persuasion lies in simplifying something down to its core, and communicating to others what they really care about.

18. Being Prepared Gives you the Advantage - Your starting point should always be to know more about the people and situations around you. Meticulous preparation allows for effective persuasion. For example, you dramatically improve your odds in a job interview by being completely versed in the company’s products, services, and background.

19. Detach and Stay Calm in Conflict - Nobody is at their most effective when they are “On Tilt.” In situations of heightened emotion, you’ll always have the most leverage by staying calm, detached and unemotional. In conflict, people turn to those in control of their emotions, and trust them in those moments to lead them.

20. Use Anger Purposefully - Most people are uncomfortable with conflict. If you’re willing to escalate a situation to a heightened level of tension and conflict, in many cases others will back down. Use this sparingly, and don’t do it from an emotional place or due to a loss of self-control. But do remember, you can use anger purposefully to your advantage.

21.  Confidence and Certainty - There is no quality as compelling, intoxicating and attractive as certainty.  It is the person who has an unbridled sense of certainty that will always be able to persuade others.  If you really believe in what you do, you will always be able to persuade others to do what’s right for them, while getting what you want in return.

This article is available online at: 
http://www.forbes.com/sites/jasonnazar/2013/03/26/the-21-principles-of-persuasion/

Mar 26, 2013

Robert Cialdini's YouTube animation of his criteria for Influence: the Science of Persuasion.

This animated video describes the six universal Principles of Persuasion that have been scientifically proven to make you most effective as reported in Dr. Cialdini’s groundbreaking book, Influence. This video is narrated by Dr. Robert Cialdini and Steve Martin, CMCT (co-author of YES & The Small Big).

About Robert Cialdini:
Dr. Robert Cialdini, Professor Emeritus of Psychology and Marketing at Arizona State University, has spent his entire career researching the science of influence, earning him a worldwide reputation as an expert in the fields of persuasion, compliance, and negotiation.

Dr. Cialdini’s books, including Influence: Science & Practice and Influence: The Psychology of Persuasion, are the result of decades of peer-reviewed published research on why people comply with requests. Influence has sold over 3 million copies, is a New York Times Bestseller and has been published in 30 languages.

Because of the world-wide recognition of Dr. Cialdini’s cutting edge scientific research and his ethical business and policy applications, he is frequently regarded as the “Godfather of influence.”