Of course, the true intention of the study was not to gauge the ability of the “learner” to remember pairs of data; indeed, the actor/researcher in the role of “learner” would repeatedly make intentional errors to elicit the “punishment” response from the “teacher.” What was really being studied was the willingness of the “teacher” to administer what they believed to be intense and dangerous shocks, some up to 450 volts (115 volts being the voltage of an average wall socket). The participants acting as “teachers” must have known the danger of the voltage, since the buttons bore labels such as “slight shock” and “danger: severe shock” (Ibid.). Furthermore, though the “learner” was actually not being shocked at all, the actor would scream. At 120 volts, the “teacher” would hear the “learner” cry that the shocks were becoming too painful, and, at 150 volts, the “learner” would demand that the experiment come to a halt. Eventually, the “learner” would refuse to respond at all, but the researcher in the room would inform the “teacher” that silence should be counted as an error and the shocks should continue. Participants would often ask the researcher present things like “Is this safe?” or “Shouldn’t we stop?” but the researcher would calmly reply: “You have no choice, you must continue” (337). The question was: at what voltage level would participants quit, refuse to continue, or simply leave? Before starting the experiment, Milgram posed this very question to a sample of middle-class adults, a group of Yale psychology students, and a panel of psychologists, all of whom believed only about 1% of participants would administer the severest shocks (338). In fact, in Milgram’s standard experiment, 65% of participants—“normal” people, demographically speaking—obeyed all instructions and administered the most extreme shocks (337). This is an extremely disturbing finding.
Perhaps you are sure that you would refuse to shock someone to death just because you were urged on by someone with a slight bit of authority over you (like a researcher). But consider what that 65% figure means: statistically speaking, next time you’re stuck on an airplane in the middle seat, odds are that at least one of the people beside you—and quite possibly both—would be willing to administer a severe shock to someone like you. Those 65% of participants could vote, in a landslide, for a candidate whom they would then follow completely, regardless of the marching orders. If Nietzsche was right that 100 men created the Renaissance and can save humanity from any cultural drought, it is still probable that 65 of them are potential Eichmanns. Furthermore, by slightly altering the circumstances, Milgram found that up to 90% of participants would continue to follow orders if they had a greater psychological distance from the victim (for example, by relaying, but not personally carrying out, the order to administer shocks). This relates to the Dr. Phil show because, numerous times every segment, Phil tells his guests that they should, or must, do something to “improve” their lives. Particularly in the final segment, Phil extends the same advice to his willing audience—both in the studio and at (the psychological distance of) home. From a social psychological standpoint, Herr Phil is the diabolical experimenter, counting on the fact that his followers will blindly obey his orders whatever the costs. Of course, the advice might be good, but it might also be embroiled in personal biases, partisan ideology, and individual flaws, broadcast throughout the world. Traditionally, the role of the analyst is to lead the subject to self-awareness and positive, conscious choices, not to issue commands and edicts. If someone stops drinking, beating their spouse, or molesting children simply because an authority told them to, is that real progress and a solution, or is it simply covering one disturbing psychosis (i.e.
alcoholism) with another (i.e. rash obedience) that may seem innocuous but has been used to explain massacres and holocausts alike? Dr. Phil, of course, is not Eichmann any more than Stanley Milgram is. It is us, the viewers, the potential participants and “teachers,” who have the dangerous potential to obediently follow directions without thinking for ourselves. When we listen to authorities—like Dr. Phil, Nazi leaders, or academic researchers—and do whatever they say, from a psychological standpoint, we are listening to our fellow participant’s screams yet continuing to shock them to death. Of course, that is simply from a psychological standpoint. From an ethical standpoint, we might wonder whether it is better to be shocked to death than to continue living with the shocking fact that our individuality, our free thought, and our personal agency have been dead all along.

*At Deconstructing Phil, we always strive to bring you firsthand accounts from the writings of prominent philosophers, psychoanalysts, and theorists. While Stanley Milgram is an influential psychologist and his book Obedience to Authority does provide detailed and direct accounts of his famous experiments, all four copies of the book were checked out when I checked at my local university library. Mook’s textbook, however, is a fairly detailed, objective, and accurate look at some of psychology’s most notable experiments. Second, it should be noted that there were serious ethical challenges to Milgram’s experiments. These concerns deal with the circumstances and awareness of the subjects, though, and do not mitigate or call into question the ultimate findings. Lastly, while this post is already long and detailed, it should be noted that another, similar experiment which goes a long way toward explaining the interaction between authority and obedience is the Stanford Prison Experiment (1971).