Beyond the Shock Machine
“Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments,” by Gina Perry
In the summer of 1961, as the televised trial of Adolf Eichmann riveted the nation, hundreds of men responded to an advertisement in The New Haven Register seeking participants in a “scientific study of memory and learning.” When volunteers arrived on the Yale University campus, they were assigned the role of either “teacher” or “learner.” The study, a researcher in a lab coat explained, was exploring how punishment might affect learning. A “teacher” was instructed to take a “learner,” seated in an adjacent room, through a series of simple memory questions. At each incorrect answer, the teacher was told to press a lever on a machine to administer a shock to the learner. To better spur concentration, the strength of the shocks increased after each error. There were 30 switches on the shock machine, beginning with a tame 15 volts and proceeding all the way to 450 volts, with visible warnings that included “danger: severe shock” and finally an ominous “XXX.”
The study had nothing to do with memory or learning. Aside from the volunteers assigned the teacher role, all people involved were actors. The real topic of study was obedience: Would someone administer what they believed were painful, perhaps lethal, shocks to another person, for no reason other than being told by an authority figure to do so? In the most famous version of the experiment, the learner stated that he had a heart condition while being strapped in, but was told by the researcher not to worry. As the shocks progressed, the learner’s grunts turned to yells; soon he was screaming about heart pain and demanding to be released. By the end he made no sound at all, even when supposedly receiving powerful jolts.
When volunteers verbally protested, the researcher fed them a series of statements, such as, “The experiment requires that you continue” or “You have no other choice.” As the voltage increased, volunteers trembled, sweated and let out shrieks of nervous laughter, but were assured that Yale would take full responsibility for the well-being of the learner. According to psychology professor Stanley Milgram, who dreamed up the experiment, nearly two-thirds of the volunteers continued administering shocks to the maximum voltage, even after the learner went silent.
The obedience experiments showed, Milgram later told “60 Minutes,” that if “a system of death camps were set up in the United States of the sort we have seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town.”
The results of the experiments were captivating, hinting that we all carry the latent seeds of Eichmann in our DNA, and it’s no surprise that the findings, initially published in an obscure academic journal, took only a few days to land on the pages of The New York Times. There were some early critiques, questioning the validity of the results and the ethics of placing subjects under extreme duress, but this was a freight train of a finding, and it flung off any detractors.
Now, some 50 years later, Australian psychologist Gina Perry has re-examined the experiment to determine what exactly happened, and what it might mean. As she writes in “Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments,” “The standard account of Milgram’s experiments suggests that ordinary people can be manipulated into behaving in ways that contradict their morals and values – that you or I could be talked into torturing a man. But could we?”
The Milgram experiments became an overriding obsession for Perry. She tracks down participants, interviews graduate students who worked on the project, and rummages through Yale archives. Her doggedness pays off in many ways: She reveals that a number of participants claimed they didn’t actually believe the shocks were real; that Milgram failed to publicize segments of his research that undermined his dramatic findings; and that in more than half the experiments — Milgram conducted 24 variations — the majority of people didn’t obey the orders.
Especially fascinating is her account of the disturbing history of “social psychology” experiments, which placed unknowing subjects in stressful situations in order to “reveal” something about human nature. Though Milgram would come under heavy criticism for subjecting people to immense stress — he waited nearly a year to tell most volunteers that the shocks they administered were fake — at the time it was common to overlook the trauma such experiments might cause. Most researchers were amazingly cavalier: babies (often non-white, often orphans) were plunged into cold water, startled with loud noises, and placed alone in dark rooms; smoke was pumped into classrooms to see how students would respond to an emergency; Army recruits were told that they had triggered an explosive device that had probably killed people. One positive consequence of the Milgram experiments was that they sparked a debate regarding this ethical blind spot and prompted reform within the profession.

The trauma, as Perry discovers when visiting volunteers many years later, was quite real. “I actually checked the death notices in The New Haven Register for at least two weeks after the experiment to see if I had been involved and a contributing factor in the death of the so-called learner,” one subject told her. Another wrote, “When no response came … with the stronger voltage I really believed the man was probably dead.” But these accounts, like much in the book, further muddy the water, making it hard to draw conclusions. If the experiment seemed real enough that a number of people thought they might have actually killed someone, what to make of those who claimed they thought it was all a setup? Might some simply be covering their ass? If I had run the circuit of the shock machine and then learned it was fake, I’d likely claim that I knew it all along as well.
And what about the link to Eichmann, with Milgram once referring to those who obeyed as “moral imbeciles” who could staff “death camps”? As Perry writes, “How could Milgram have measured destructive obedience, the authors asked, if his subjects saw the experimenter as a benign authority? Didn’t they naturally perceive the lab as a safe place, and the experimenter’s imperviousness to the learner’s cries as evidence that they weren’t really inflicting pain?” Perry cites critics, such as Diana Baumrind, who attacked Milgram’s dramatic claims about Nazis in New Haven: “The social psychologist’s laboratory, [Baumrind] argued, could hardly replicate a ‘real-life experience’ such as Nazi Germany. His subjects had little in common with SS subordinates: the SS man was likely to regard his victims as subhuman and believe both he and his superior officer were working together for a ‘great cause.’ The guilt and conflict of Milgram’s subjects were further evidence that the parallel between the Yale laboratory and concentration camps was weak.” Another academic, Don Mixon, argued that the project didn’t reveal our readiness to commit immoral acts, but instead the faith we place in experts. “People go to great lengths, will suffer great distress, to be good,” he told Perry. “People got caught up in trying to be good and trusting the expert. Both are usually thought of as virtues, not as evils.”
The greatest shortcoming of the book is that Perry doesn’t interrogate such claims. It is too bad that Baumrind could not read Christopher Browning’s extraordinary 1992 book, “Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland.” Browning recounts how a group of non-military, non-ideological, older family men, who “went through their formative period in the pre-Nazi era,” were turned into killers, despite being tormented by guilt and conflict.
And since when has trusting experts been considered a virtue? Based on Perry’s interviews and archival digging, it is clear that plenty of participants believed they were inflicting pain, and they inflicted such pain precisely because they viewed Yale as a benign authority. Milgram, whose Jewish parents fled Eastern Europe, sought from the beginning to link his findings to the Nazis. In that he failed. But he shed light on something subtler: our disposition, once we have identified authority figures as benign, to trust and follow their orders, even when they appear reckless and cruel. Soon after Milgram’s experiment, we should remember, “experts” led our country into a disastrous war in Vietnam, and more recently, “expert” bankers (and regulators) played a central role in driving our economy into the ground. Many subjects in the experiments, faced with a very confusing situation, decided in the end to trust the expert. Surely, they reasoned, the experts won’t let things get out of control. History tells us otherwise.
Gabriel Thompson has written for The New York Times, New York, The Nation and Mother Jones. His most recent book is “Working in the Shadows: A Year of Doing the Jobs (Most) Americans Won’t Do.”