Docsplainin' -- it's what I do

After all, I'm a doc, aren't I?



Saturday, January 24, 2009

Blind Obedience

or, Some Things Never Change

Umpty-ump years ago, Stanley Milgram, a Jewish psychologist interested in better understanding the Holocaust, began a series of experiments to see just how far ordinary citizens would go in obeying orders to harm another person. In the best-known of these demonstrations, 65% of his study subjects shocked (as far as they knew) another human being clear into unconsciousness and (for all they knew) possibly even death. This, even in a situation in which it was perfectly safe to tell the authority figure where to stuff it. As one of my students put it the other day, "People are sheep."

However, as Miller (2009) notes, over the entire series of experiments (there were 16 in all), 60% of Milgram's 540 subjects disobeyed orders at some point. So not all people are sheep, and some are sheep only up to a point.

Reasons (some) people are sheep (up to a point) include, first, that in our culture at least, we are socialized to obey authority. Children who mind well are valued, whereas those who don't are punished, taken to therapy, or both. We even have a diagnostic code for particularly willful little brats: We label it a mental illness, Oppositional Defiant Disorder. I would also submit that the child who obeys is the child who survives to maturity and reproduces. Children who play with fire, run with scissors, and dart out into traffic do not. It's that simple.

We don't, of course, raise our children to be cruel. This underscores Milgram's point, which was that it is really the situation, not our national character or our individual personalities, that is the most powerful factor in producing destructive obedience (Blass, 2009). It also explains why people predict, based on self-knowledge, that they would never do such a thing--and then most do, if the conditions are right. You and I might, too. The trouble is that we don't know as much as we might about which situational variables have which effects. In any event, we can't control situations. We can, at least potentially, control our reactions. But to do that, we need to know more. We need to know what characterological traits enable us to prevail over even the most powerful situations. But let's get back to the list of reasons why we act like sheep sometimes.

Second, sheep-ish behavior can be induced by degrees. In the Milgram studies, as is often the case in real life, people eased into bad behavior--in this case, 15 volts at a time. What's 45v when you've just administered 30? ...165v when you've just administered 150? Hitler didn't start out saying, "You're gonna personally slaughter 6 million Jews and several million more gays, gypsies, and people with disabilities." No. He started out just talking bad about them, which got people ready to make them wear the Star of David or a pink triangle or whatever on their sleeves; then he took away some relatively minor civil rights, working his way up to Kristallnacht and such. He crept up on rounding them up and killing them, one small step at a time. And the sheep--whoops! I mean, people--went along with him. Not everybody, of course, but enough so that he could get the job done.

Salespeople are trained in a similar technique: Get the consumer to say "yes" to as many questions as you can ("wouldn't you prefer a car that is reliable? beautiful? American-made?" Well, of course) before you spring the ultimate question ("Will you buy this car?") on him, and he will say "yes." It's a tried-and-true technique. Using this method, Milgram got people to say "yes" to 15v and worked them up, 15v at a time, until they were administering potentially lethal shocks. Hitler got people to say "yes" to talking ugly about Jewish moneylenders and worked his way up to "yes" to the cold-blooded slaughter of unarmed women, innocent children, and helpless cripples. It was easy. And it only took a few years to talk a couple of million people into this madness.

Which brings us to #3: We tend to do what the rest of the flock--whoops! I mean, group--is doing. It's called groupthink, and many, many studies have documented just how distorted (and extreme) our thinking can get when the group not only doesn't offer a corrective point of view but is actually leading us to the fringe. Fourth, when somebody is telling you to do something, you can slough off responsibility. "I was only following orders" echoes down through history as justification for all kinds of moral failings, ethical violations, and crimes against humanity, from Andersonville to Abu Ghraib. "Baaaa," quoth my student. The sheep will follow the goat.

Most interesting, however, is the sloughing off of responsibility onto the victim. Milgram found that subjects who obeyed ascribed twice as much of the responsibility for what happened to the victim, compared to subjects who resisted (Blass, 2009). Milgram thus may have been the first scientist to document the blame-the-victim phenomenon which Hitler so ably exploited in turning his countrymen against the Jews.

In Jerry M. Burger's (2009) replication of the Milgram studies, which just landed in my mail box last week, one variation in the experimental condition was to have a confederate model noncompliance. This had nearly zero effect on rates of obedience among experimental subjects. However, when two confederates modeled noncompliance in the original Milgram studies, experimental subjects also refused requests to shock their victims any further. It seems that if there is a group of people doing the right thing, it becomes easier for us to resist malignant authority--it gives us another, perhaps more palatable, group to join. We don't have to give up the safety of the group in order to follow our conscience: The new group presumably offers us more or less the same benefits as the old one.

Unfortunately, in the real world, there may be no effective individual models for resistance, never mind groups. Evil organizations tend to screen those folks out where they don't self-eliminate, so that for the individual at a moral or ethical decision-point the only available behavioral models may all be torturers and murderers who furthermore are skilled at socializing the novice into the group pathology. An example would be a corporation headed by a sociopath. He (and the stats are that it usually is a "he," guys, so don't get your dander up) will intuitively, if not consciously and deliberately, hire like-minded folk. Who will in turn hire more of the same to work in various corporate departments and branch offices. The occasional non-psychopath who accidentally gets hired will either leave on her/his own, get co-opted (i.e., turned into a sociopath), or be run out of the company. Or sit silently by while evil is done (Miller, 2009). So what you will wind up with over any extended period of time is a company full of sociopaths. But I digress.

Evolutionary psychologists would say that we are hard-wired to feel, think, and behave as a tribe, and it has obvious short- and long-term survival value for us to do so. Think about it: There's not much with a lower potential for long-term survival and reproduction than a single hominid alone on the vast African savannas, is there? Indeed, half a million or so years later, among certain Plains Indians the most severe punishment available was to run an individual out of the tribe--an almost certain death sentence. And of course even today in dangerous professions such as police work or military service, "going along to get along" has immediate survival value. You have to be able to count on your comrades having your back (see Benjamin & Simpson, 2009, for further discussion).

Stage hypnotists know that the very fact of a subject's volunteering--the terms of which are defined by the way the invitation is framed and delivered--guarantees better (i.e., more easily hypnotizable or compliant) sheep--dang! I did it again--subjects for the show than if one picked them out of the audience oneself. Similarly, rates of obedience in an experiment or an SS unit may be higher than they would be in the population at large. Miller (2009) also makes this point.

Given all this, the wonder is that 35-60% of Milgram's subjects stopped when they did. Indeed, as Lee Ross (1998; cited in Benjamin & Simpson, 2009) wrote, "The Milgram experiments ultimately may have less to say about 'destructive obedience' than about ineffectual and indecisive disobedience" (p. 16).

Of course in Milgram's studies, nobody was really getting shocked, so, presumably, no one was harmed. Still, there were ethical concerns: It was pretty stressful for the study's participants, and we just don't do that anymore. Consequently, in order to "replicate" the study, one would have to change the procedures pretty drastically.

Burger (2009) has done just that. His study stopped after participants thought they had delivered 150 volts to the confederates, whereas Milgram's study went to 450v. Burger selected this stop point because in Milgram's lab, most people who quit stopped there, while most people who crossed that line continued to the bitter end. Burger also screened out people (the depressed, the anxious, the traumatized) who might be harmed in the study. Alan Elms, who assisted in the original study, calls this "obedience lite" (2009, p. 33)--more on which later.

The question was, has anything changed? Are we more aware of the danger of blind obedience than we were 50 years ago? Would it make any difference if we studied women, too? Burger found that (1) no, it hasn't, (2) no, we apparently aren't, and (3) no, it doesn't. The latter probably will not surprise students of the Holocaust, who know that some of the most notorious concentration camp overseers were SS women. And some of you may recall that the poster child for Abu Ghraib was a girl.

Jean Twenge writes that, according to her research, young people today should be far more likely to defy authority than a bunch of middle-aged white guys would have been in Milgram's day. She interprets Burger's data as showing a trend, albeit a nonsignificant one, in the direction of increased resistance to authority, and she points out several aspects of Burger's study that may have inadvertently suppressed further evidence of change (Twenge, 2009). Unfortunately for her argument, Blass (1999, 2004; cited in Blass, 2009) did a meta-analysis of 25 years' worth of studies and found a "near zero" (p. 43) correlation between the year a study was conducted and its obedience rate.

Also, Twenge's analysis relies heavily on sampling differences, and for Burger to use a sample more like Milgram's would, in my opinion, have rendered his study ecologically invalid. We want to know what college-educated, ethnically diverse, gender-inclusive police and army units would do in the face of an illegal order. We want to know what a melting-pot populace like ours, asked to go along with a Guantanamo or a Patriot Act, would do. With the possible exception of corporate boards, exclusively white male groups like Milgram's study sample are, after all, thin on the ground these days.

Twenge (2009) also argues that Burger's study confounds obedience with violence and lack of empathy. She notes that the increase in narcissism over the generations since Milgram's day, and the desensitization to violence from TV and video games, may account for part of the "obedience" rate Burger found. The empathy argument won't wash, however, as Burger tested for empathy (and consequently, indirectly, for narcissism) in his subjects and found this trait not predictive of compliance/noncompliance. As for the violence argument, that is splitting some exceedingly fine semantic hairs. Who cares what we call it? The end result is the same: People will still hurt other people when told to and/or given the opportunity to do so. Baaaa. I am sure that from Andersonville to Auschwitz to Abu Ghraib, plenty of narcissists and violent people took full advantage of the opportunity to act out, side by side with sheep-ish folk who were merely "following orders."

A stronger argument is probably that of Elms (2009), who pointed out that by stopping the study at 150v, Burger effectively eliminated the most critical aspect of the original. And as a result, Elms argues, we cannot say whether people would still hurt, injure, and possibly even kill each other under orders. Maybe if the victims were still writhing, screaming, and finally falling silent, the increased independence that Twenge has documented in her work would have asserted itself and Burger's rates of disobedience would have been much higher. Or not: Burger's subjects were still arguably well within their comfort limits, but on the other hand, discomfort didn't seem to slow down many of Milgram's. We may never know, because the likelihood of getting such a study past an IRB these days is just slightly less than nil.

Elms and Twenge both express concerns that screening out people who would be "upset" by the study reduced disobedience. But that dog won't hunt, either. I would argue (again) that first, being upset did not appreciably contribute to disobedience in Milgram's original studies: In fact, one of the most notable findings was the degree to which people could be upset and still continue on with the experiment. Second, police and military organizations take great care to screen out the anxious, the depressed, the previously traumatized. If we want ecological validity in a study predicting what a soldier might do in, say, a My Lai-type situation, then we need to screen out the folk who wouldn't likely make it into an overseas combat unit in the first place. Works for me.

So while the "lite" factor is a big problem, I don't buy the other criticisms. From my point of view, the biggest disappointment is that we didn't learn something more about, to paraphrase Ross, effective, decisive resistance. Elms (1972; Elms & Milgram, 1966; as cited in Elms, 1990) conducted extensive psychological testing on subjects from Milgram's series who had obeyed or not obeyed, in an attempt to discover some differences. He found only nonsignificant correlations between obedience and authoritarianism.

As noted earlier, Burger had thought that maybe the degree of empathy a person generally has for others might make a difference, or the degree to which control was important to them, but neither personality trait proved to be a predictor. Nor were such demographic factors as gender, age, or level of education. The ethnic subgroups weren't big enough, statistically speaking, to test.

So we still don't know why approximately one-third to two-thirds of people will refuse to obey an illegal order. To me, that would be the most useful information of all, and I hope somebody, somewhere, is working on that. Burger's was an elegant piece of work, and one can only hope that there will be lots of replications with variations intended to tease out what it is that allows some people to "just say 'no'" to human cruelty.


References

Benjamin, L. T., Jr., & Simpson, J. A. (2009). The power of the situation: The impact of Milgram's obedience studies on personality and social psychology. American Psychologist, 64, 12-19.

Blass, T. (2009). From New Haven to Santa Clara: A historical perspective on the Milgram obedience experiments. American Psychologist, 64, 37-45.

Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64, 1-11.

Elms, A. C. (2009). Obedience lite. American Psychologist, 64, 32-36.

Miller, A. G. (2009). Reflections on "Replicating Milgram" (Burger, 2009). American Psychologist, 64, 20-27.

Twenge, J. M. (2009). Change over time in obedience: The jury's still out, but it might be decreasing. American Psychologist, 64, 28-31.

1 comment:

Lisa Jeffery said...

"Virginia, Thank you for this fascinating and thought-provoking article. I read it with great interest; it sparked several thoughts in my mind. It is very similar to what I teach in my persuasion class, in helping students understand why propaganda can be effective. Have you ever read the study by Elisabeth Noelle-Neumann called "The Spiral of Silence" (circa 1970s)? She found that people fear isolation from the group MORE than they feel their desire to speak up. This spiral explains how people got caught up in the Nazi Holocaust, the Ku Klux Klan, and others. Even though they see moral atrocities happening, people will go along with it to avoid the punishment, isolation, maybe prison, maybe death. Of course, she said, we need the Nelson Mandelas, the Rosa Parks, etc., to stand up... knowing they have a price to pay. Only then will others join them. Your article also made me think of a haunting video, "The Stanford Prison Experiment," done in the 1970s. Have you seen it? Thanks!"
