In the blog lifespan of every postmodern critic of Dr. Phil, there inevitably comes a time when one must come to terms with a tragic and inhumane aspect of modern life: genocide. At first glance, it seems unreasonable, illogical, and even disrespectful to equate any of Dr. Phil’s actions—however misguided or despicable—to the large-scale, calculated, and heinous instances of ethnic cleansing in the Holocaust, the Armenian genocide, and more recent or ongoing conflicts in the Balkans, Chechnya, or Sudan. If it is a question of moral blameworthiness, legal culpability, or adverse cultural impact, obviously Hitler, Slobodan Milošević, and others would have to take the cake. But Dr. Phil does not claim to be an International Court of Justice judge, or even an unimpeachable icon of lucid moral propriety. Instead, Herr McGraw claims his amorphous right to be broadcast into our homes and minds because he is a trained Doctor (Ph.D.) of psychology. This is a bit problematic because, psychologically speaking, there are more than ample grounds to equate Dr. Phil’s furor for helping Americans with the Führer’s goals of helping the Aryan race.

A good case study of the psychology of genocide is found in Hannah Arendt’s Eichmann in Jerusalem, which details the political life and subsequent trial of a high-ranking Nazi official who had orchestrated the deportation, ghettoization, and eventual extermination of millions of social, political, and ethnic undesirables. One might expect Arendt to find a plethora of evidence that Eichmann was a crazed psychopath, a rabid anti-Semite, and, above all, an extreme exception far outside the normal spectrum of human society. In fact, Arendt finds quite the opposite. Arendt writes: “the trouble with Eichmann was precisely that so many were like him, and that the many were neither perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal…this normality was much more terrifying than all the atrocities put together.” (253). Even more striking are Arendt’s observations that Eichmann “was obviously no case of insane hatred of Jews, of fanatical anti-Semitism or indoctrination of any kind. He ‘personally’ never had anything whatever against Jews.” (22-23). The question then becomes: if Eichmann was not an abnormal sociopath, an ardent anti-Semite, or an atypical brute among men, how could this unexpected characterization possibly be reconciled with his role as the architect of the Holocaust? Arendt’s response, though potentially valid and accurate, is far more disturbing than any act of Nazi barbarism, as it works toward explaining—though in no way justifying—a wide array of modern monstrosities. Arendt’s answer is that the “long course in human wickedness” teaches not of aberrant psychopaths and bigots, but rather of the overwhelming, subversive, and dangerous power of the “banality of evil.” (231). Arendt writes that the judges overseeing Eichmann’s trial, like almost everyone involved, simply assumed that Eichmann was lying and that the psychological reports were wrong—obviously the man on trial was insane and full of calculated hatred. By doing so, they missed the real issue: that “an average, ‘normal’ person, neither feeble-minded nor indoctrinated nor cynical, could be perfectly incapable of telling right from wrong.” (23).
For Arendt, “his guilt came from his obedience, and obedience is praised as a virtue.” Inspired very much by Arendt’s writing, social psychologist Stanley Milgram set out in the mid-1960s to create a set of experiments that would empirically measure how far people would go to follow authority, including the “willingness to follow inhumane orders.”* (Douglas Mook, Classic Experiments in Psychology, 335). Milgram recruited participants using a traditional method: newspaper ads and posters with vague language inviting people to take part in a psychological experiment (Mook 336). Demographic data on each recruited participant were noted as they were brought into a laboratory setting. Participants were introduced to a second individual, presented as another study participant but actually an actor and a member of the research team. Participants were told they would be acting as the “teacher” while the second participant (actually the actor/researcher) would be the “learner.” (Ibid.) The “learner” was sent into a separate but adjoining room where he or she could be heard, but not seen. The participant believed the study tested the psychology of memory, since the “teacher” conveyed the first item of a word pair to the “learner,” who would be required to remember and communicate back the correct item to complete the pair (Ibid.). Participants were told that, as “teachers,” they would be required to administer “punishments” to the “learner” in the form of increasingly severe shocks for each wrong answer (Ibid.).
Of course, the true intention of the study was not to gauge the ability of the “learner” to remember pairs of data; indeed, the actor/researcher in the role of “learner” would repeatedly make intentional errors to elicit the “punishment” response from the “teacher.” What was really being studied was the willingness of the “teacher” to administer what they believed to be intense and dangerous shocks, some up to 450 volts (115 volts being the voltage of the average wall socket). The participants, as “teachers,” must have known the voltage was dangerous, since the buttons carried labels such as “slight shock” and “danger: severe shock.” (Ibid.) Furthermore, though the “learner” was not actually being shocked at all, the actor would scream. At 120 volts, the “teacher” would hear the “learner” cry that the shocks were becoming too painful and, at 150 volts, the “learner” would demand that the experiment come to a halt. Eventually, the “learner” would refuse to communicate a response, but the researcher in the room would inform the “teacher” that this should be counted as an error and that the shocks should continue. Participants would often ask the researcher present things like “Is this safe?” or “Shouldn’t we stop?” but the researchers would calmly reply: “You have no choice, you must continue.” (337) The question was: at what voltage level would participants quit, refuse to continue, or simply leave? Before starting the experiment, Milgram put this very question to a sample of middle-class adults, a group of Yale psychology students, and a panel of psychologists, all of whom believed only about 1% of participants would administer severe shocks (338). In fact, in Milgram’s standard experiment, 65% of participants—“normal” people, demographically speaking—would obey all instructions and administer extreme shocks (337). This is an extremely disturbing finding. Perhaps you are sure that you would refuse to shock someone to death just because someone with a slight bit of authority over you (like a researcher) urged you on. Even so, statistically speaking, the next time you’re stuck in the middle seat of an airplane, odds are that at least one of the people at your side (and there is better than a 40 percent chance that both) would be entirely willing to administer a severe shock to someone like you. These 65% of participants could vote, in a landslide, for a candidate whom they would then follow completely, regardless of the marching orders. If Nietzsche was right that 100 men created the Renaissance and can save humanity from any cultural drought, it is still probable that 65 percent of them are potential Eichmanns. Furthermore, by slightly altering the circumstances, Milgram found that up to 90% of participants would continue to follow orders if they had a greater psychological distance from the victim (for example, by relaying the order to administer shocks rather than personally carrying it out).

This relates to the Dr. Phil show because, numerous times in every segment, Phil tells his guests that they should, or must, do something to “improve” their lives. Particularly in the final segment, Phil extends the same advice to his willing audience—both in the studio and at (the psychological distance of) home. From a social-psychological standpoint, Herr Phil is the diabolical experimenter, counting on the fact that his followers will blindly obey his orders whatever the cost. Of course, the advice might be good, but it might also be embroiled in personal biases, partisan ideology, and individual flaws, broadcast throughout the world.
Traditionally, the role of the analyst is to lead the subject to self-awareness and positive, conscious choices, not to issue commands and edicts. If someone stops drinking, beating their spouse, or molesting children simply because an authority told them to, is that real progress and a solution, or is it simply covering one disturbing psychosis (i.e., alcoholism) with another (i.e., rash obedience) that may seem innocuous but has been used to explain massacres and holocausts alike? Dr. Phil, of course, is not Eichmann any more than Stanley Milgram is. It is us, the viewers, the potential participants and “teachers,” who have the dangerous potential to obediently follow directions without thinking for ourselves. When we listen to authorities—like Dr. Phil, Nazi leaders, or academic researchers—and do whatever they say, from a psychological standpoint, we are listening to our fellow participant’s screams, yet continuing to shock them to death. Of course, that is simply from a psychological standpoint. From an ethical standpoint, we might wonder whether it’s better to be shocked to death than to continue living with the shocking fact that our individuality, our free thought, and our personal agency have been dead all along.
*At Deconstructing Phil, we always strive to bring you firsthand accounts from the writings of prominent philosophers, psychoanalysts, and theorists. While Stanley Milgram is an influential psychologist and his book Obedience to Authority does provide detailed and direct accounts of his famous experiments, all four copies of the book were checked out when I checked at my local university library. Mook’s textbook, however, is a fairly detailed, objective, and accurate look at some of psychology’s most notable experiments. It should also be noted that there were serious ethical challenges to Milgram’s experiments. These concerns deal with the circumstances and awareness of the subjects, though, and do not mitigate or call into question the ultimate findings. Lastly, while this post is already long and detailed, it should be noted that another similar experiment which goes a long way toward explaining the interaction between authority and obedience is the Stanford Prison Experiment (1971).
Today’s Dr. Phil show dealt with eating disorders, primarily anorexia and bulimia. I had expected it to be a particularly telling episode, especially from the previews, which showed Phil staring down an emaciated girl with the words “you are going to die...soon!” In fact, McGraw was far more reasonable and reasoned than normal. He went out of his way several times to say “it’s not as easy as saying: start eating,” and he did make several salient points. However, if there is one part of Phil’s logic and methods that needs to be addressed, surely it would be his claim—in an apparent contradiction—that the problem stems “from within” as well as being “driven by media images [and] media icons.”

McGraw did not elaborate on how the subject’s interior psyche could be related to a larger social consciousness, but luckily Freud did precisely this in his work Civilization and Its Discontents. Freud writes that “it was discovered that a person becomes neurotic because he [or she] cannot tolerate the amount of frustration which society imposes on him [or her] in the service of its cultural ideals.” (39). Considering only this idea, one could imagine how any or all of the four guests on the show could have become anorexic or bulimic because of society’s imposed cultural ideals. However, the fact that Freud writes “cannot tolerate” clouds the situation. The standard explanation of anorexia, incorporating Freud’s vocabulary where possible, would be: the subject feels society imposing the cultural ideal of skinniness, health consciousness, and so on, causing them to try to fulfill that ideal to the extreme. But that is not Freud’s model. To him, the psychosis arises not from the wish to fanatically fulfill society’s imposed ideals, but rather from the subject’s inability or unwillingness to tolerate such ideals. It would be more in line with Freud to say that these guests are, in fact, not enthralled by the media’s glamorous portrayal of youth, beauty, and tiny figures. Instead, from the very beginning of their psychosis, they found these images and ideals to be quite disgusting and deplorable. It was exactly this desire not to tolerate, to rebel, which drove them to the extreme, just so that they could prove to themselves, to their families and friends, to Dr. Phil, and to the whole world that the ideal is an extremely dangerous and perverse one. Engrained within their psychosis is a realization that Freud already understood, but that Dr. Phil and the mainstream media are understandably reluctant to make: “this useless thing which we expect civilization to value is beauty.” (45). Dr. Phil can blame Nicole Richie and the media which fetishizes small sizes but, as Freud understands, that is simply a confined, contemporary manifestation of the problem and not the problem itself. As he writes: “the urge for freedom…is directed against particular forms and demands of civilization or against civilization altogether.” (49). Today’s guests, then, are obsessed with and drawn into the values and images of the media at the exact symbolic location of their rebellious psychosis. Specific cultural values have always changed and will always continue to change, but the individual’s great need to “defend his [or her] claim to individual liberty against the will of the group” is an innate and unstoppable force, and one which, not surprisingly, Dr. Phil cannot understand or articulate.