The Lucifer Effect Summary - Philip Zimbardo



This microbook is a summary/original review based on the book: The Lucifer Effect: Understanding How Good People Turn Evil


ISBN: 0812974441


Summary

The basis for the award-winning 2015 Hollywood movie “The Stanford Prison Experiment,” and the starting point for countless debates on the nature of evil, “The Lucifer Effect,” by Philip Zimbardo, deals with one of the fundamental questions of moral philosophy: “What makes good people do bad things?”

So, get ready to hear at least a few of “the myriad reasons why we are all susceptible to the lure of ‘the dark side,’” and prepare to get to know yourself as you never have – inner demons and all.

The banality of evil

In 1963, political theorist Hannah Arendt published a book nowadays deemed one of the nonfiction classics of the 20th century. Titled “Eichmann in Jerusalem,” it provided a detailed analysis of the war crimes trial of Adolf Eichmann, one of the major organizers of the Holocaust.

What Arendt found immensely strange about Eichmann was that this man, though personally responsible for the deaths of millions of Jews, showed no extraordinary hatred for them, nor any sign of deep anti-Semitic indoctrination. As she wrote, “he personally never had anything against Jews.” Moreover, he was evidently neither a monster of exceptional intelligence nor a low-IQ psychopath – but a worryingly “normal” individual. “More normal, at any rate, than I am after having examined him,” one of Eichmann’s psychiatrists is said to have exclaimed, while another found his whole psychological outlook and attitude toward life “not only normal but most desirable.”

But this very normality – Arendt famously concluded – was precisely the problem with Eichmann. Like so many people, he was a “joiner,” someone who could only define himself through his contributions to something greater than himself, someone who wanted to feel less responsible for his own actions by accepting someone else’s directives. “From the viewpoint of our legal institutions and our moral standards of judgment,” wrote Arendt, “this normality was much more terrifying than all the atrocities put together, for it implied… that this new type of criminal… commits his crimes under circumstances that make it well-nigh impossible for him to know or feel that he is doing wrong. It was as though in those last minutes [of Eichmann’s life] he was summing up the lesson that this long course in human wickedness had taught us – the lesson of the fearsome, word-and-thought-defying banality of evil.”

The Milgram experiment and the power of situational forces

In the words of American social psychologist Stanley Milgram, Arendt “became the object of considerable scorn, even calumny” after the publication of “Eichmann in Jerusalem.” Neither the intellectual world nor the general public wanted to believe that people like Eichmann – or Stalin or Hitler, for that matter – are like them. Normal people, critics said, would never mastermind (or even participate in) the organized death of millions; only sadists would.

Milgram, however, knew better – not least because, the very same year that Arendt published “Eichmann in Jerusalem,” he presented to the world the results of one of the most well-known and most frightening social experiments ever conducted.

Eichmann had barely been put on trial when Milgram and his collaborators recruited a group of people under the false premise that they would be taking part in a memory-improvement study. The participants were told that their job was to boost the memory skills of a few “learners” – played by actors hidden from view in another room – by administering painful electric shocks through the wall. In reality, the electroshock generator the participants operated delivered no shocks at all; a tape recorder in the other room automatically played prerecorded responses for each shock level. Each time the “learner” made a mistake, the participant was supposed to “punish” them with an electric shock, the intensity of which was increased with every new error.

Alarmingly and unexpectedly, almost all of the participants kept increasing the shock level even after hearing the feigned pleas of the highly distressed “learners.” What’s more, approximately 2 out of 3 participants administered the maximum (and seemingly life-threatening) electric shock of 450 volts! True, some of them were initially reluctant, but it took no more than a little nudge from the white-coated experimenter taking notes next to them to change their minds. Just like Eichmann, they were too normal not to obey the directives of a supposedly more knowledgeable superior.

The Stanford Prison Experiment

It gets even worse. Just a few years after Arendt’s uncomfortable conclusion and Milgram’s terrifying study, an even more notorious social experiment was conducted in the United States, known ever since as the Stanford Prison Experiment – SPE, for short. Despite the SPE’s immediate infamy, this book, “The Lucifer Effect,” is the first detailed written account of the events surrounding it – published more than three decades after the original experiment. You don’t have to guess why: its author, Philip George Zimbardo, was the study’s principal investigator and the man who conceived the whole thing.

Funded by the U.S. Office of Naval Research, the SPE was supposed to uncover how normal people would react in a simulated prison environment. To that end, an advertisement was placed in a newspaper asking for paid volunteers for a “psychological study of prison life.” Attracted by the relatively generous sum ($15 a day), about 70 college students answered the advertisement. Zimbardo and his colleagues selected 24 based on their physical and mental health; the idea was to find the most normal, the most ordinary among them.

Afterward, the 24 selected applicants were randomly divided into two equal groups: 12 were given the role of guards, and the remaining 12 were assigned the role of prisoners in a mock prison environment (located in the basement of a campus building), with Zimbardo acting as the superintendent. Even though the guards were ordered not to physically abuse the prisoners, they were expected to subject them to humiliations and indignities to simulate the conditions of a real-life prison. Each prisoner was also made to carry a chain padlocked around one ankle and forced to wear a “dress”; the guards, of course, were properly uniformed as well.

Zimbardo’s objective was to quickly create an “atmosphere of oppression,” but even he didn’t expect the role-playing prisoners to stage a rebellion within only two days of the experiment. The enraged guards quickly responded by devising a carrot-and-stick management system that included not only authoritarian measures, but also severe psychological torture. Some prisoners – three, to be exact – couldn’t bear it and had to be released within just four days. Others passively accepted their roles and, at the guards’ request, even began harassing their fellow inmates in the hope of being treated more fairly. As the guards grew more and more brutal, the prisoners became more and more miserable.

On the sixth day of the experiment, after an outside observer watched some of the recordings with absolute terror and alarm, Zimbardo decided to put an end to it. His conclusions were – for lack of a stronger phrase – absolutely shocking. Allow us to go over them.

The situational forces that shape who we are

Most of us believe that our personalities are consistent. Though we may be tempted to stray a bit from time to time, we tend to remain recognizable, character-wise, for most of our lives. We carry a unique and somewhat static combination of qualities and characteristics for decades; we are, in other words, a certain way. So, if the 40-year-old Eichmann could organize the mass murder of 6 million Jews, the 15-year-old Eichmann must have demonstrated some inclination toward that kind of action. Serial murderers, after all, are often sadistic toward animals as children; they are, to use an everyday phrase, born evil. People don’t change that much overnight.

Amazingly enough, this is not at all true: people can change dramatically over the course of an hour. Just imagine yourself talking to a close friend on the subway while on your way to work. Now, imagine that in the middle of your conversation, at the very next station, your boss unexpectedly hops on. Will you continue behaving the same way, or start behaving a bit differently? What if it is not your boss, but a friend of your parents with a young child? Would you adjust your behavior once more, or act the same way you would around your friend or your boss? Which of the three yous is actually you?

Of course, you might say that you are “more” you around some people and “less” you around others, but that is not the point. The point is that your character is not fixed, and that who you are depends on the circumstances and the social context. That’s what enraged people about Eichmann: what Arendt was essentially saying is that had this man not lived in Nazi Germany, he would probably have been a respected member of some community, a loving husband and an even better father. The Milgram and Zimbardo experiments demonstrated that the opposite is true as well: under certain circumstances, you, of all people, could become an Eichmann or a Hitler – just so that you can belong somewhere, just so that you can accommodate the expectations of other people.

Investigating social dynamics

Though people differ from each other, some circumstances produce unnervingly similar effects in most of us. Under the following conditions, most people are liable to fall prey to the wolves howling in their hearts and change so profoundly as to be unrecognizable even to themselves:

  • Deindividuation. According to a Vietnamese saying, “in order to fight each other, baby chicks of the same mother hen paint their faces different colors.” In other words, when disguised or playing a role, we are more inclined to act in ways not aligned with what might be deemed “expected” of us. This is what happened in the simulated environment of the Stanford Prison Experiment and in the real-life case of the Abu Ghraib torture and prisoner abuse. At a more tolerable level, it is also what often happens to a person who moves to another country where nobody knows them – unrecognized, they frequently use the opportunity to reinvent themselves.
  • Dehumanization. If deindividuation makes the perpetrator anonymous – “thereby reducing personal accountability, responsibility, and self-monitoring” – dehumanization “takes away the humanity of potential victims, rendering them as animal-like, or as nothing.” Just think of all the ways the Nazi propaganda machine described the Jews, or how African American slaves were described in the 19th century: it’s much easier to treat human beings without any respect – and even take their lives – if everybody routinely calls them “pigs” or “apes”... isn’t it?
  • Obedience, conformity, and moral disengagement. The Milgram experiment says it all, really: fatherlike authorities often pressure people into obedience and conformity to such an extent that they become morally disengaged from their actions. In other words, if there is someone who seems to know the rules, people would rather follow them blindly than question them – even when they suspect the rules might be arbitrary or plain wrong.
  • The evil of inaction. In 1964, Kitty Genovese was killed in plain sight of her neighbors – reportedly 38 respectable, law-abiding New Yorkers. Social psychologists soon proposed an explanation: apparently, the more people witness an emergency, the less likely any one of them is to intervene. Once again, this is due to our tendency to surrender our responsibilities in the presence of others who might share them. After all, why should one get involved in “someone else’s business”? But, then again, if everybody thinks that – who’s going to stand up to evil?

Final Notes

Even though “The Lucifer Effect” was originally advertised as the first detailed firsthand account of Philip Zimbardo’s groundbreaking 1971 Stanford Prison Experiment, this exceptional book is so much more – not least because, in the author’s words, it strives to be a chronicle of his lifelong attempt “to understand the processes of transformation at work when good or ordinary people do bad or evil things.”

Dubbed “important” and “powerful” by most publications, Zimbardo’s findings will probably change forever the way you think about yourself. In its original review, The Times wrote that “all politicians and social commentators” should read “The Lucifer Effect.” We beg to differ: this book should be required reading in every single classroom.

12min Tip

“The only thing necessary for the triumph of evil,” goes a line commonly attributed to Irish-born statesman Edmund Burke, supposedly from a letter addressed to Thomas Mercer, “is for good men to do nothing.” So, do something. We suggest the following four things:

  • Accept responsibility for your actions, even when you are not required to.
  • Question all ideologies, especially the ones that seem to conceal an agenda.
  • Defy unjust authorities.
  • Put other people before yourself, at least from time to time.


Who wrote the book?

Philip Zimbardo is an American psychologist and professor emeritus at Stanford University, best known for his Stanford prison study. He is the author of various introductory psychology books and textbooks for college students, including the New York Times bestseller “The Lucifer Effect.”