Talk:Milgram experiment/Archive 1

From Wikipedia, the free encyclopedia

Collaboration

On 30 August 2001, the very first organized collaboration to improve an article was started: a short "article-a-day" e-mail on the Milgram experiment sent during a simpler time when only 145 people had user pages. Thus this article was the very first precursor to the Collaboration of the week.

Source: Announcements August 2001
Eric Herboso 04:58, 21 Mar 2005 (UTC)

Invitation

The following was here previously. It is not understandable English, but perhaps someone might be able to find some information in it: --LDC

This experiment not only(also excellent films)showing how subordinate human psychic setup really being so any outside feedback reenforcing this already genetically prepared behavior is BUT modern complex societies doing a lot to turn people into perfectly automated beings that do not sense any contradictions any more. This is leading to such strange behavior that I introduced the term SOCIATRY here and if you want some more details visit. http://users.aol.com/archive1/s.html Not being able to voice any critique we can identify many developemental trends trying to cover up any of these absurdities. Want to work here,too? Feel free to contact me at: ARCHIVE1@aol.com To understand the strange goings-on i ran across doing and developing graffiti-research I have been in need of some meta-explanatory matrix to understand what was going on if I dared to look closer Axel Thiel Kassel Germany


Did they believe they were causing pain?

From the article:

As a hair-splitting aside, it must be noted that the shocks 'inflicted' are in essence unitless. The experimenters' claim that 450 volts is fatal is not entirely true (see electric shock). Without knowing the amperage, the actual physiological effect cannot be determined. Therefore what the experiment reveals is the power of subjective belief over an ignorance of physics. Did no subject ask about the current?

Let's put it this way. 450 V is potentially fatal (in real life, much less will do the job), and the "experimental equipment" was marked with the word DANGER at that voltage. The question is: did the experimental subjects believe they were causing great pain or death? The answer is yes. The Anome 19:16 Apr 17, 2003 (UTC)

Moreover, in many experiments, the "victim" was supposed to scream, beg, plead, and eventually appear to pass out. --Andrew 09:45, Apr 27, 2004 (UTC)
This point seems quite critical to me. The key point, as I understand it was that the "learner" apparently passed out or died. In fact, the switches controlling the shocks were labeled as "Dangerous" or "XXX" in at least one of Milgram's experiments. Knowing that the "teachers" believed they were not only giving painful shocks but also had the potential to "kill" the "learner" seems essential to understand why they expected only sadists to continue through to the end. --Abqwildcat 00:49, 10 Jul 2004 (UTC)
I agree, personally. A while ago I gave the Methods section a rewrite which explicated the nature of the electric shocks and why the "learner" was reacting the way he did, but it got reverted. Maybe someone else should give it a try. Marblespire 06:20, 27 Jan 2005 (UTC)
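On the volts-versus-amperes point above, a minimal Ohm's-law sketch may help. The body-resistance figures below are rough, commonly cited round numbers for illustration, not values from the experiment or from this discussion; the point is only that the current through the body, not the voltage alone, determines the physiological effect.

```python
# Ohm's law: I = V / R. The same 450 V produces very different currents
# depending on body resistance, which is why voltage alone does not
# determine the physiological effect.
def body_current_ma(voltage_v, resistance_ohm):
    """Current through the body in milliamps, assuming a purely resistive path."""
    return voltage_v / resistance_ohm * 1000

# Rough, commonly cited figures (assumptions for illustration):
# ~1 kilohm for wet skin / good contact, ~100 kilohms for dry skin.
for r_ohm in (1_000, 10_000, 100_000):
    print(f"450 V across {r_ohm} ohm -> {body_current_ma(450, r_ohm):.1f} mA")
```

At the low-resistance end this works out to hundreds of milliamps, well above the tens of milliamps usually described as potentially fatal, which is consistent with The Anome's point that 450 V is genuinely dangerous in real life even though a voltage label by itself does not fix the current.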

Is it about obedience?

Maybe I don't understand. Maybe I missed something, but I think there is room in this article for criticism of Milgram's methodology and conclusions, if someone can find such information from the scientific community.

From my humble point of view (I am no psychologist), the experiment is interesting but seems to be deeply flawed for two reasons. First, I guess that the 'teacher' is in some way convinced that the experiment, while not pleasant at all for the 'learner', must be safe, since the scientist wants it to continue. Is this the authority he is talking about? I think this is more related to the concept of trust than authority. The other thing, and this one is more important, is that the 'teacher' knows that the 'learner' is free to walk away. This is a strong point to convince the 'teacher' that he has no real responsibility in what's going on. I think that if the 'learner' were not free to walk away, the 'teacher' would stop quite early in the experiment.

As said in the article, Milgram devised the experiment to answer the question “Could it be that Eichmann, and his million accomplices in the Holocaust were just following orders?”. In the light of what I said, the two situations are really different. People involved in the Holocaust knew that the result of their actions was death. They also knew that all those human beings were not free to escape.

In fact, the "victim" was not (supposed to be) free to walk away - he was strapped into the chair; at one point he begged for release. The problem is exactly that the "teacher" lets the scientist take responsibility and trusts him not to harm the learner. This article is necessarily very incomplete; if you want the whole story, read Milgram's book. There are certainly differences between every version listed here and the actual Holocaust situation; nevertheless, the experiment casts obedience in a startling light. Milgram also points out that this sort of obedience may be necessary for social functioning. --Andrew 09:45, Apr 27, 2004 (UTC)
Still, it seems incorrect for Milgram to assume his results were from obedience to authority rather than other factors. For instance, it would be seriously illegal for the scientist to order you to kill or seriously harm another individual against their will. Since any of the "teachers" were surely aware of this fact at some level of consciousness, it seems that most if not all of them were simply able to convince themselves they were not truly hurting the "learner" because it would be inconceivable for something as immoral as that to be sanctioned by, for example, Yale University. And in fact, anyone who may have assumed this during the course of the experiment would have been correct.

— Elijah Gregory

Anticipating this objection, Milgram also ran a copy of this experiment in a rundown office building in Bridgeport, with no link to Yale or any other prestigious institution; it was said to be run by "Research Associates of Bridgeport" (a name concocted for this study). Obedience dropped somewhat, but 19 of 40 subjects still shocked the victim to the end. DanielCristofani 03:28, 15 December 2005 (UTC)
I don't know about that. Going from (slightly over) 2/3 to (slightly under) half seems like a noticeable enough change that it shouldn't be brushed aside with the phrase "obedience dropped somewhat". (Especially when it stays at a constant 61-66% regardless of time or place, yet this single change reduces it to below that.)
I don't think "somewhat" is brushing it aside. And many of the other variations Milgram did reduced the rate of obedience at least as much, although as you note, replicating the experiment in other countries etc. did not. DanielCristofani 17:52, 28 April 2006 (UTC)
And that only answers half of the objection, anyway; a subject couldn't assume that Yale sanctioned the experiment, but they could still assume that the experimenter was acting legally, that the law did sanction the experiment, and that the law wouldn't sanction anything really immoral. Again, this assumption would be correct. Ken Arromdee 15:24, 3 April 2006 (UTC)
The assumption that "the law wouldn't sanction anything really immoral" is of really dubious accuracy. That assumption is, however, one of the main bases for obedience to authority. Remember, "authority" here means someone who is perceived as having the RIGHT to give orders, which in turn is usually grounded in some legal framework if you bother to trace the idea that far. The "bad" examples of obedience to authority, like the good ones, almost always involve people obeying orders from an authority whose right to give orders is (or is perceived to be) sanctioned by an accepted legal system.
When you keep saying "this assumption would be correct" and things, it sounds a bit like "the supposed 'learner' wasn't actually being hurt, so the subjects were justified in continuing to push the buttons"... which would make any experiment of this kind worthless unless someone was actually being shocked. The questions are 1. whether the subject had a good reason for thinking the 'learner' wasn't getting hurt, in the form of "prestigious universities do not do Very Bad Things" or "the US Government would not do or permit Very Bad Things", and more to the point, 2. whether the subject in fact believed the 'learner' was not getting hurt. Many of the subjects were convinced the learner was being shocked, but went on pushing the buttons anyway. DanielCristofani 17:52, 28 April 2006 (UTC)

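To make the disputed size of the Bridgeport drop concrete, here is a quick sketch. The 26-of-40 baseline figure is the commonly cited result for Milgram's standard Yale condition and is an assumption here; the discussion above gives only the 19-of-40 Bridgeport figure and the 61-66% range.

```python
# Full-obedience rates in the two conditions discussed above.
def obedience_rate(fully_obedient, total_subjects):
    """Percentage of subjects who administered the maximum shock."""
    return fully_obedient / total_subjects * 100

yale = obedience_rate(26, 40)        # baseline Yale condition (assumed figure)
bridgeport = obedience_rate(19, 40)  # "Research Associates of Bridgeport" variant
print(f"Yale baseline: {yale:.1f}%")        # 65.0%
print(f"Bridgeport:    {bridgeport:.1f}%")  # 47.5%
print(f"Drop: {yale - bridgeport:.1f} percentage points")
```

Whether a drop of that size counts as "somewhat" is exactly the judgment call being argued over in the thread above.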

For correlation between Milgram's experiment and Holocaust situation, see Zygmunt Bauman's Modernity and the Holocaust -- mz 10:43, 9 Sep 2004 (UTC)
This is part of the point. Surely a respected institution such as Yale wouldn't sanction this sort of thing if it was harmful? Surely the experimenter is a learned and responsible individual who would have no malign motive? Thus we proceed because it must undoubtedly be the case that authority knows best, and better than we do. --Ross UK 01:48, 28 April 2006 (UTC)
If the participant knew that Yale has a good reputation, and decided "Yale wouldn't sanction this sort of thing if it was harmful", that would be *correct reasoning*. The participant could not have known the particular reason why Yale thought it wasn't harmful (he couldn't have known it was fake), but in fact, if it was harmful, Yale wouldn't have allowed it, so a participant who deduces that is making an accurate deduction from a known fact about Yale.
(Without knowing the whole history of Yale-sanctioned experiments, I do at least want to note that there is still some dispute about whether Milgram's experiment was harmful to its subjects, and doing the same experiment would not be allowed today.) DanielCristofani 17:52, 28 April 2006 (UTC)
This is not the same as a blind trust in authority, because it is based on Yale's reputation, not merely on the fact that orders are being given.
"Authority" does not mean "the fact that orders are being given", it means "the fact that orders are being given which appear to be sanctioned by an institution with a good reputation". DanielCristofani 17:52, 28 April 2006 (UTC)
Yes, "blind trust" is a different matter. If someone tells you to do something then the average person might question it. If they are someone apparently trustworthy then the critical faculty appears impaired. --Ross UK 05:39, 29 April 2006 (UTC)
The same goes for "the law wouldn't sanction this if it was harmful". Ken Arromdee 14:35, 28 April 2006 (UTC)
And ditto here. DanielCristofani 17:52, 28 April 2006 (UTC)
Agreed. --Ross UK 05:39, 29 April 2006 (UTC)

Consider this scenario: You're working in the government. You are told to do something which will ultimately result in tax being collected from people who do not want tax to be collected from them and clearly consider this collection of taxes to be harmful to themselves. Does this imply a moral obligation not to collect those taxes or help anyone do so, and is collecting them just following orders?

Collecting taxes after you are ordered to do so is following orders, yes. By definition. The moral issue calls for more analysis. Assume that the original job description did not mention tax collection--just as Milgram's subjects were not told anything about killing a man. We can distinguish between:
1. Doing what authorities order you to do, regardless of whether you think it is the right thing to do, because you see your own opinions as irrelevant (in Milgram's words, you have entered an "agentic state"). You may have an opinion, but if so, you do not act on it.
2. Using your own moral judgement to decide whether each order is legitimate or not, and obeying or disobeying based on your judgement.
We can further distinguish between:
2a. Deciding that what you have been ordered to do is morally acceptable, and therefore doing it.
2b. Deciding that what you have been ordered to do is morally unacceptable, and therefore not doing it.
In your example, many people will do 2a -- aware of the consequences to the taxed, they will still feel that taxation is morally justified, and will thus go ahead and collect taxes. Your question "Does this imply a moral obligation..." asks whether I would condemn this behavior. I would not.
Behavior 1 is what worries me, as it worried Milgram. His experiment was designed to distinguish 1 from 2, and later to explore social forces contributing to both behaviors. Unfortunately, behaviors 1 and 2a both consist of obeying orders, so it is hard to design an experiment to separate them. Milgram's workaround was to design the experiment so almost nobody would do 2a -- almost nobody would decide that it was morally justified to kill people in order to complete an experiment on learning. Then he ignored 2a, assumed obedient subjects are doing behavior 1 and disobedient ones are doing behavior 2b, and varied the context to figure out what makes people do one or the other. An awkward but fairly effective solution. DanielCristofani 05:10, 1 May 2006 (UTC)

You might try to claim that government officials are elected by the populace, but they might not have been elected by the particular individuals you are collecting taxes from, in which case it's still an obeying orders situation with the majority giving you orders to hurt a minority. Or you could claim that the minority agreed to participate in the political process as a whole and that once they have agreed it's too late to reject parts they don't like--but I could equally claim that people who agreed to an experiment in which they are shocked can't reject it.

These and other moral arguments are for people who are already engaged in behavior 2. The experiment is not about whether people see it as morally okay to kill the "learner"--we assume they don't. The experiment is about individual moral (and other) thinking getting replaced or overridden by obedience to an accepted authority, and what influences make that happen or not. (Incidentally, Milgram argues in detail that people do not want to kill the learner, and cites a subsidiary experiment where people were free to choose their own shock levels. He found that all but two subjects used only mild shocks when allowed to choose the levels freely. This number--two per group--is not a good estimate of the size of group 2a, for several reasons, but it is the closest thing we have.) DanielCristofani 05:10, 1 May 2006 (UTC)

I could come up with other examples. If you are working for a doctor, should you refuse to inject a patient with something that causes the patient visible pain, because you may not morally take the doctor's word that the injections are for the benefit of the patient?

If you decide to take the doctor's word, you are doing 2a. But Milgram deliberately made his "experimenter" hard to believe--he was designed to come across as completely rigid, behaving according to fixed rules rather than reacting appropriately to the situation. Of course this is necessary since he has to say and do the same things the same way for experimental validity, but it is also intended to make him unpersuasive. In response to any concerns about the learner's safety, for instance, he just says "the shocks may be painful but they cause no permanent tissue damage"--which sounds like a canned answer if the subject just asked a specific question about the possibility of heart failure, as many of them did. And it sounds even more like a canned answer in response to "What if he's dead in there?" DanielCristofani 05:10, 1 May 2006 (UTC)

I don't think "authority" means what you think it means.

Be specific. Do you mean you don't think "obedience to authority" means "obedience to someone who you believe has an institutionally-based right to give orders", but rather something more like "obedience to anyone who takes it into his head to give orders"? I know this sounds like a strawman but you said a blind trust in authority was "based merely on the fact that orders are being given"... DanielCristofani 05:10, 1 May 2006 (UTC)

Nor do I think that taking Harvard's word that the experiments are not harmful is relinquishing your moral responsibility, if your belief that Harvard wouldn't hurt anyone is based on your own moral judgments of Harvard's record and not on blind faith. Of course, if you misjudge Harvard you may hurt someone, but lots of misjudgments, including those that involve no orders, can lead you to hurt someone. Normally we don't say that that possibility makes the act of judgment inherently wrong. Ken Arromdee 17:03, 30 April 2006 (UTC)

The experiment is set up to look like it's going awry in a way that had not happened before, and that nobody in the administration had foreseen. Any subject who honestly took Yale's reputation as meaning no study approved by any of its subcommittees would ever go awry in a way that would kill a person, would be doing 2a. Although we actually can't separate the responses as cleanly as I've been saying--anyone who goes on shocking the guy is likely to rationalize it as being morally okay in one way or another.
Anyway, as far as the Bridgeport version, the possibility of subjects looking at the record of the US government and deciding that it would not allow a study to go on within its borders that could go awry in a way that would kill a person--well. I don't see a clear line between that and blind faith. DanielCristofani 05:10, 1 May 2006 (UTC)
Okay, I'm convinced my tax collection analogy wasn't good since taxes are clearly all case 2a, but I'm not convinced about the experiment itself. I think that the statement "Any subject who honestly took Yale's reputation as meaning no study approved by any of its subcommittees would ever go awry in a way that would kill a person" is an exaggeration. The subject would not have to believe that Yale never makes mistakes, they would only have to believe that Yale is no more likely to make mistakes than they themselves are.
As for the reputation of the US government, while trusting it too much is a mistake, there are degrees of trust and areas where it can be more or less trustworthy. For instance, someone could distrust the US government when racism or politics is involved, while trusting it more in other cases. And even if someone trusts the US government too much, there's a difference between trusting too much and blind faith. I don't think it's a big enough stretch to count as blind faith--they may honestly have never heard of the Tuskegee experiment or whatever. Ken Arromdee 15:35, 1 May 2006 (UTC)
I just noticed I never answered this. Basically: Yes, that's clear, and I agree with you. DanielCristofani 07:09, 4 June 2006 (UTC)

I read in an old psychology textbook that the rate of obedience was inversely proportional to the subject's level of education and increased with each year of military service.

Sounds interesting. A citation would be useful. -- The Anome 07:13, 12 Jun 2004 (UTC)

Picture

There is a note on WP:FAC that this article could use a picture. Would the sign from Electric shock do? -- Solipsist 06:25, 11 Aug 2004 (UTC)

I've only seen that sign once, in London, but it made me laugh very hard. I don't really think that it goes along with this article all that well, especially considering no-one is actually shocked. Acegikmo1 14:52, 17 Aug 2004 (UTC)

featured article removal candidate

Is there any reason that this article is being considered for removal? I think there's lots of promise in it, and if it was previously "up to snuff" so to speak, it shouldn't be hard to get it back to featured article quality.

I think just some constructive criticism along with a removal consideration would be useful. Anyone? --ABQCat 04:41, 8 Sep 2004 (UTC)

The person who put {farc} here has done a sloppy job of it by failing to mention why. I'd give it 48 hours, and if no comment surfaces, just remove it. The article looks good to me. (A little thin in certain areas, but still FAC-quality.) --Yath 04:21, 9 Sep 2004 (UTC)

I'm not the one who posted the notice, but, for starters:

  • Why does the lead paragraph cite a 1974 source, when that is just the book in which the work was later described? The actual work was performed and published over a decade earlier, so in a lead this is absolutely misleading. I remember studying the Milgram experiment in Psych 101 in 1972: even then, it was already very famous.
Maybe because 1974 is when Milgram published the book "Obedience to Authority: An Experimental View" which describes the experiment and all its variations in detail, as well as Milgram's theories and interpretations about it. "summarized" is really the wrong word and I think I'll change it. DanielCristofani 03:28, 15 December 2005 (UTC)
  • Related: the article does not even give the original publication data for Milgram's original paper.
  • There is no discussion of the role this experiment had in the adoption of standards for human experimentation.
  • There is no discussion of how this fit in with Milgram's unusual and often flamboyant style of experimentation. It would be OK if that were discussed elsewhere and merely alluded to here with a "see also" but it's not in the article Stanley Milgram, either.

Again, I can't speak for the person who posted the notice, but I do feel this falls short of featured article quality. -- Jmabel 21:07, Sep 9, 2004 (UTC)

Insights into human psychology

Is it absolutely uncontroversial that the results of the experiment were indeed valuable insights into human psychology? I'm not saying that it is not, since I know next to nothing about the theme, but the article states that quite peremptorily, and controversial experiments often have their results (and their merit) contested by some sectors of the scientific community. So I was just wondering. Regards, Redux 21:28, 9 Sep 2004 (UTC)

Is anything in psych "absolutely uncontroversial"? This is certainly one of the most famous experiments in the history of the discipline, and is pretty standard fodder as part of Intro Psych courses. It certainly was a very surprising result, tending to invalidate a lot of previously held beliefs. -- Jmabel 22:51, Sep 9, 2004 (UTC)

"Participant" vs. "subject"

Is there any particular reason why the word "subject" is disused in favor of "participant" nowadays? Does it make sense to use the latter when describing something that used the former? --Taak 03:11, 11 Sep 2004 (UTC)

This was brought up on my talk page first. Here is the discussion:
The featured article for today is the Milgram experiment. It is not entirely acceptable because it contains gender bias such as the use of 'he' when the sex is not specific. It also contains the term 'subject', which is often regarded by psychologists today as an unacceptable term; the accepted term for that role now is 'participant'. Ironically, the terminology is particularly important in the context of the Milgram experiment.
I made edits in the article itself, and in the subsection used in the template. Would it be possible to change it over on the basis of what I have said?
Many thanks. Bobblewik  (talk) 09:50, 9 Sep 2004 (UTC)
It is not entirely acceptable because it contains gender bias such as the use of 'he' when the sex is not specific. - this is your opinion, which is not necessarily shared by other Wikipedia contributors. In formal written English, there is no third-person gender-neutral pronoun ("they" is often used in informal situations, but some (myself included) find it inappropriate for Wikipedia). "He" is a generally accepted alternative, and is in fact recommended by more conservative style guides. Going around demanding that it be changed will likely get you labeled as a PC-pusher, and the way other people react to you will be unpleasant. (PC-pushers are, generally, not warmly received here)
As far as "subject" vs "participant" - I admit my knowledge of psychology is poor, so I can't speak with any authority on the subject -- however (as a native English speaker) it sounds like you are trying to whitewash the language. Substantively, I don't see a difference between the terms. But if you can give or point to a substantive justification behind it (give the URL of a psychology style guide somewhere that discusses the use of the words), then I don't think anyone will object. →Raul654 16:20, Sep 9, 2004 (UTC)
I am certainly not trying to whitewash the language, merely to correct two flaws in the article that are not compliant with modern psychological guidelines. See the British Psychological Society style guide: '10.2 Inappropriate labels' for the subject/participant issue, and '10.1 Sex-specific language' for the issue of using 'he' when the sex is not necessarily male.
The American Psychological Association (APA) also has a style guide. I understand that the subject/participant issue is in 'Chapter 2: Guidelines to Reduce Bias in Language', and it says something similar to: "Terms like participants, respondents, or students should be used instead of the term subjects". I cannot give you a URL, but it is referred to in the APA FAQ: "I've noticed that subjects is often changed in copyediting, most often to participants. Why?"
Trying to help. Bobblewik  (talk) 18:59, 9 Sep 2004 (UTC)
Well, like I said, WRT subject vs participant, it looks like you have a fair case. After reading what you cited, I wouldn't have any objections. →Raul654 02:00, Sep 10, 2004 (UTC)

So it looks like the British Psychological Society and the American Psychological Association are whitewashing the language. I suppose we must endure it in articles about psychology? Annoying. --Yath 06:12, 13 Sep 2004 (UTC)

I wouldn't go so far as to calling it "whitewashing" but it is pretty lame. --Taak 19:09, 18 Sep 2004 (UTC)

I think it's more that psychologists have a negative reputation in society, and calling subjects participants (which sounds much more "consensual") may give them an extra shred of credibility. Also, it's really an issue of what language they want used in psychology journals - just as engineers defined what "strain" vs. "shear" means, psychologists are defining what "subject" vs. "participant" means, and choosing the correct term is necessary to avoid sounding dumb to other people in their field. Just look at the history of psychology and you find examples of experiments that could never be done today. The Stanford Prison Experiment springs to mind. Phil Zimbardo is one of the most esteemed psychologists in the US, but if he were to propose such an experiment again today, he'd be laughed out of the university human subjects committee. So, subjects vs. participants is a big deal because it gives the (correct) impression that modern psychology experiments are consensual. That said, I believe the correct term for the people who were in the Milgram experiment is undoubtedly "SUBJECTS", because they don't conform to any reasonable definition of "PARTICIPANT" (i.e. they were not "willing" participants). Call people in modern psych studies "participants", but these folks are "subjects" in every sense of the word. --ABQCat 19:28, 18 Sep 2004 (UTC)

I think "participant" is a poor choice of word because it blurs the distinction between the subject and anyone else involved in the experiment, such as the confederates and experimenters in this study. Certainly they participated in the experiment as well, and it makes it awkward to talk about those who are the subject of the study and those who aren't if forced to use this ambiguous term. --Taak 19:55, 18 Sep 2004 (UTC)

why unethical

the article claims that "Most modern scientists would consider the experiment unethical today". I can't for the life of me figure out what about this experiment is unethical, given that no one was actually shocked or harmed in any way. Can anyone explain? -Lethe | Talk

Putting someone through a process during which they believe themselves to be possibly complicit in inflicting torture, and at one point are liable to believe they have just killed someone? You see no ethical issue there? -- Jmabel 21:49, Sep 14, 2004 (UTC)

Whether anyone was "put through" such a process or "put themselves" through it needs to be answered first -- they were, after all, acting of their own "free will."

However, there is one large hole in this article that does have to do with ethics: what happened when the experiment ended? Were the subjects immediately told the truth, or were they allowed to believe for the rest of their lives that they had tortured a man to death? The latter, I think we can all agree, would be unethical.

If someone knows the answer, please update the article. -Seth

As someone who has (yearly) been tested on ethics by my university's human subjects committee in order to conduct psychology experiments, let me clear something up. The largest and most glaring reason this experiment would be considered unethical if performed today is that the participants did not give informed consent as to what they were going to participate in. Some trickery is allowed in getting such consent (when too much information would obscure possible experimental results - common in social psychology experiments) but is reviewed by human subjects committees at universities. Afterwards, subjects are immediately debriefed and given full disclosure as to what they were doing, deceptions are revealed, and the participants are supposed to leave feeling educated but not tricked. The lack of informed consent, with extreme trickery that wasn't governed by a human subjects committee, is the primary reason it would be considered unethical today. --ABQCat 23:14, 29 Nov 2004 (UTC)

Okay, that makes more sense. I'm not sure how you could perform such an experiment without trickery, and I wonder how you could ever study the psychology of trickery and lying, but that's another subject, and I suppose it's better to err on the side of caution. So, does anyone know for a fact that the subjects were immediately told this was a trick? If so, that ought to be added to the article. -Seth

I know for a fact that they were told so. However, some of them (as was found later) didn't completely understand all nuances of what happened. Can't give a reference, sorry, but I think I read it following links from this very article. Paranoid 00:15, 17 Dec 2004 (UTC)
They all met with the guy they were supposed to have shocked, so they knew he wasn't dead or anything, anyway. DanielCristofani 11:56, 5 November 2005 (UTC)

Incorrect on a number of points, I believe. (1) The subjects were not told anything about the "learners." They were not "debriefed," I believe is the correct psychological term, which is always done today in this type of experiment. In fact, the article specifically says that even the subjects who administered the lethal shocks left at the conclusion of the experiment without asking to see the "learner" or inquiring about his condition. The lack of debriefing is also a reason why this experiment would be considered unethical today.

What "article" says this? This is explicitly contradicted in Milgram's book:

A careful postexperimental treatment was administered to all subjects. The exact content of the dehoax varied from condition to condition and with increasing experience on our part. At the very least, all subjects were told that the victim had not received dangerous electrical shocks. Each subject had a friendly reconciliation with the unharmed victim, and an extended discussion with the experimenter...

This is in Appendix I, page 194 in my copy. DanielCristofani 03:31, 4 June 2006 (UTC)

(2) Furthermore, earlier on this talk page, someone mentions that the subjects could always assume that "Yale would never sanction a harmful procedure." One of the original subjects, who refused to administer more than a mild shock, wrote that he inferred very early that the procedure was clearly suspicious and illegitimate (partly, among other things, from the way the lots were drawn to decide who would be the teacher and who the "learner") and concluded that it could not be as it was described to him by the "experimenter." This information can be gotten from the external link at the bottom of the article, "a participant talks about his experience." I may have gotten some details wrong, as I just read it and am writing quickly, but take the time to read it; it is very interesting.

I have a few points/questions of my own to add: 1. Firstly, and most important, the "results" of this famous experiment have never been analyzed with sufficient care and subtlety: the artificial nature of the entire set-up cannot be extrapolated to other situations in life without any limitations whatsoever. In particular, analogies with the Holocaust and other such situations involving individual conscience show large differences from Milgram's experiment. I am not stating that his results are valueless; only that the cavalier way in which they are used to make further claims is frequently unfounded. I cannot go into all the differences here; suffice it to say that these subjects were not experiencing the same moral conflict as an SS man who is ordered to shoot unarmed women and children and kill them in front of his own eyes, or a US soldier in Vietnam who is led into a raid on a village with innocent women and children.

There are major differences. Milgram goes into this a bit, especially in chapter 14 of his book, and he does not seem to want to apply his results in an overly straightforward way. Having to shoot multiple people while looking at them would certainly be harder to do, but also very difficult to replicate in the lab. Also worth noting is the tremendous pressure these soldiers were under, including the possibility of being executed if they disobeyed; whereas Milgram's subjects were not under the influence of bribes or threats, and had not gone through a lengthy conditioning process like army inductees. DanielCristofani 03:31, 4 June 2006 (UTC)

In particular, anyone interested in this topic, and especially anyone who thinks that Milgram's experiment yielded "definitive" results, should read Christopher Browning's Ordinary Men, an in-depth, very insightful study of one execution unit in Occupied Poland during the Holocaust. You will see that the range of responses is quite broad.

In Milgram's experiment too. In the book he gives some case studies in two chapters called "Individuals Confront Authority". DanielCristofani 03:31, 4 June 2006 (UTC)

Although Milgram's conclusions are not completely refuted, they have to be treated much more carefully, and don't exactly "say" what they are commonly thought to say. 2. Almost all of the subjects, when asked about the experiment later on, stated that they did not have negative feelings about what was done to THEM (i.e. the "teachers"). This seems to indicate that they were never told the truth of what the experiment really was. Imagine finding out that the feelings of discomfort/anguish (depending on the individual subject) that were experienced by most of the subjects were completely unfounded, and the "trick" that was played on them. Wouldn't they be upset at how they were deceived? Most said they had positive/happy feelings about their participation; some said they felt neutral about the experiment.

This is EXTREMELY shaky reasoning. You seem to be saying that if people were not angry about being tricked, it must mean that they still hadn't been told about the trick. Now we know that they were all told about the trick, and sent a full report on the experimental method and its results, at the same time as the questionnaire at the very latest. So why did they still feel glad to have been in the experiment? ("happy" is the wrong word.)
My guess would be that in their minds, the experimental outcome outweighed the deception in importance. Milgram deceived people in order to figure out how people would act in a certain situation, not just for the fun of it, and the subjects knew that; but also, his deception provoked a startling outcome which seemed important to know about. The two salient parts of the situation were:
A. People underwent a big, detailed, stressful hoax.
B. It turns out that many ordinary people will kill a man just because a supposed "scientist" tells them to.
The second is bigger, more impressive, more surprising; people think "good to know" and the original deception fades out of importance. Imagine someone steals $1000 from you, and while looking for it you find evidence that your husband is a serial killer. You might well be glad the money was stolen.
Probably worth adding that the disobedient subjects would have extra reason to feel good about the experiment because it shows them that they would NOT kill someone in that situation; this has to be a relief and an ego boost, especially since so many people were obedient. The obedient subjects, on the other hand, might feel so much guilt that they didn't feel inclined to be mad at Milgram: "Who am I, a potential killer, to be mad at a mere liar?" And they might find it important to know that about themselves so they could work on it. Alternately, they may rewrite history so that they knew from the start that the experiment was a fake; and this again is an ego boost because it has them being so clever.
DanielCristofani 03:31, 4 June 2006 (UTC)

3. One aspect which I have never seen discussed is very intriguing, I believe. What about the estimable Stanley Milgram? What does this say about HIM? Lest readers think my question irrelevant or unsophisticated, isn't his behavior also exactly what this experiment is supposedly designed to study?

The thing about Milgram himself hurting people in the name of science is kind of ironic, yeah. This was mentioned in a 1970 article called "If Hitler Asked You to Electrocute a Stranger, Would You? Probably", and has been raised at various times since. There is a minor difference in that nobody was going to actually get directly killed by Milgram's experiment, and also that he put at least some effort into taking care of his subjects...e.g. he had a psychiatrist check out some of the more vulnerable subjects a year later, to figure out whether they had been screwed up by the experiment, and apparently they hadn't. But you can certainly argue that he put people through a terrible experience and risked their emotional health. I don't know that anyone is going to say Milgram is a saint. The thing is that that doesn't say anything about the value of what was learned. If we assume for a second that running the experiment was a criminal act, that doesn't mean that we can't learn anything from it, and in particular, the fact that Milgram is capable of hurting people doesn't put a dent in the fact that so many other people (as the experiment showed) are also capable of hurting people. We can't say "it's Milgram who is the bad guy, most other people are fine just like we always thought"--the most we can say is "well, Milgram is a bad guy, too." DanielCristofani 03:31, 4 June 2006 (UTC)

Think about it: A psychologist deceives individuals (telling them that this will be a study on "punishment and learning") in order to perform an "experiment," and subjects them to anguish and/or moral conflict by putting them in a situation where they are being "compelled" to choose between hurting someone and refusing to obey a perceived respectable "psychological authority." Obviously, as other writers have said above, a certain amount of information must be withheld in order to perform studies on people, but the question is precisely, "How much?" The notion that such an individual could go on and become a psychotherapist (I know he was not on such a track) or respected authority on human behavior is frightening, at least to me.

You're saying that a person who is willing to hurt people cannot be an expert on human behavior? I don't see why not. You can be "respected" for a variety of reasons. DanielCristofani 03:31, 4 June 2006 (UTC)

Who was REALLY tested for inflicting cruelty, Milgram or his subjects?? In this respect, the Stanford Prison Experiment is even more frightening. In response to the writer who said, "I think the experiment was a great idea," may I say that I think it is grotesque; anyone who would suggest doing such a thing to other people would not be someone I would like to spend time with. And, lest anyone think I am a delicate/soft-hearted/liberal/humanities student/etc., I was trained as a Physicist, Scientist and Historian. I do NOT have a queasy stomach. But this is revolting; is this science??

Yeah, it's science, insofar as social science can be science, and yeah, it's disturbing to say the least. But I would be a little surprised if you had been trained as a Biologist and were still feeling as strongly about this as you seem to be. And as a Historian, I would like to know how you rate this experiment compared with the other things humans have done to each other throughout history...many of which did not tell us anything interesting. DanielCristofani 03:31, 4 June 2006 (UTC)

66.108.4.183 21:47, 3 June 2006 (UTC) Allen Roth

You are wrong on almost all counts. You make one or two valid points. I will only respond to a few: If Milgram writes in his book (published years after the experiment) that the subjects were told the truth later, he is either mistaken, misrepresenting, or engaging in wishful thinking. I have read three accounts by subjects and all agreed that they were never told the truth at the time. They only found out later when news of the experiment became known. Milgram does indeed impress me as the type of "social scientist" who would remember things differently. I rate this experiment as almost totally worthless; what has it ever contributed to the study of man? Zimbardo recently testified on behalf of a soldier who abused inmates in Abu Ghraib prison. Do you think he should be given special consideration because he worked under stress? I am ashamed to be an American when I hear stuff like this. Rumsfeld apparently authorized certain stressful techniques in treating the prisoners; perhaps he was inspired by Milgram's "important" study? In addition, Milgram's work basically has been interpreted as awarding a license to torture and/or abuse. Regardless of whether it implies such, that has been its effect, as an "experiment." Milgram is himself responsible for analyzing the results of his own "work" improperly: The focus was put on the majority who obeyed. No attention was paid to the few that refused to obey. As for "other" experiments that humans have done "to each other" please give me citations, and I will respond. In the Physical Sciences, I do not know offhand of any experiments performed inhumanely (and if I did, my response would be that they should not have been done). We do not, for example, drop humans or animals off a tower to see how fast they fall to the ground, in order to learn about gravity. We do not test people for electrical conductivity by sending a shock through their bodies. We do not attempt to magnetise them and see if they are attracted to Iron.
etc. Nor should we send people into space without paying the utmost regard for their safety; this last example I mention to meet your response "Well, you don't study human behavior in Physics." And finally, anyone who performed an experiment like this today would almost certainly be prosecuted for various crimes. That ought to tell you something. I do not regard this as legitimate science in any meaning of the word. Not even Biology or Sociology. Anything obtained unethically has no place in the scientific enterprise. If you do not see why someone who is willing to hurt people should not be considered an expert in human behavior, I am silent, except to say that one who behaves unethically in one area is liable to do so in another; are his results honestly reported? etc. In particular, your comments about his reasoning to decide that this experiment was "worth doing" to gain scientific results are utterly revolting. I don't have to tell you where that train of thought leads. In Browning's book, you would see that many of the young SS men who did indeed refuse to shoot people were not punished IN ANY WAY. So don't state rhetorically that they were "under threat to be shot if they disobeyed orders." See--what resulted from Milgram's work is a tendency to obey, with the excuse that that is part of the human make-up. That leaves no room for courage, loyalty and honor. I mean real honor. These are also--or can be--human traits. That's all for now, and I would appreciate it if you would leave my comments intact, and not carve them up. If you want to cite, please cut and paste to your own response, and make your comments that way. New readers ought to be able to follow the dialog on the page clearly, without inserting your link/name/signature between every two or three sentences in my argument. It doesn't seem that you have read the subject's comments to which I referred, based on your responses to my criticisms. It is the second external link from the bottom.
While I do think that some of the person's own observations about himself are somewhat juvenile and incorrect, the account he gives is very interesting and, in my opinion, quite chilling. Again--what results of any value can you point to that followed from this "scientific discovery?" 66.108.4.183 21:59, 4 June 2006 (UTC) Allen Roth

Gosh, use a paragraph break once in a while, pal, there's no extra charge. 70.189.74.104 22:25, 4 June 2006 (UTC)
As requested, I will leave this in one piece and copy parts of it to respond to.
"You are wrong on almost all counts. You make one or two valid points."
Gee, thanks. :)
"If Milgram writes in his book (published years after the experiment) that the subjects were told the truth later, he is either mistaken, misrepresenting, or engaging in wishful thinking. I have read three accounts by subjects and all agreed that they were never told the truth at the time."
And these accounts by subjects were NOT published years after the experiment? Also, agreeing that they were not told the truth at the time is not the same as agreeing that they were not sent a full report on the experiment and its outcomes.
"Milgram does indeed impress me as the type of "social scientist" who would remember things differently."
Since this is your personal impression, I can't really address it.
"I rate this experiment as almost totally worthless; what has it ever contributed to the study of man?"
It showed that many, even most, ordinary people will follow orders from anyone perceived as a legitimate authority figure, even if that may mean hurting or killing an innocent person. That was a big surprise to psychologists and to the general public. It is a very important thing to know; it's a big problem that we need to find ways of dealing with. In later variants of the study, Milgram tried to figure out what forces lead to obedience in these circumstances, and how it could be undermined.
"Zimbardo recently testified on behalf of a soldier who abused inmates in Abu Ghraib prison. Do you think he should be given special consideration because he worked under stress? I am ashamed to be an American when I hear stuff like this."
He should not be given special consideration, no. Nobody who was involved in the abuse should be let off. But Zimbardo is not Milgram...you should put Zimbardo stuff in the Zimbardo article. The American government and armed forces certainly have a lot to feel guilty about these days, at least those of them who still have a sense of guilt; but again that's not relevant.
"Rumsfeld apparently authorized certain stressful techniques in treating the prisoners; perhaps he was inspired by Milgram's "important" study?"
Hmm? This seems like pure guilt-by-association, only there isn't even any association; the government's use of torture has no connection with Milgram's study. What do you even mean? Do you mean you think they got the idea of using electricity to cause pain from Milgram, and not from the numerous people who had used it previously? This is just bizarre.
"In addition, Milgram's work basically has been interpreted as awarding a license to torture and/or abuse. Regardless of whether it implies such, that has been its effect, as an "experiment.""
I don't think so. Who has interpreted it that way? I haven't seen anyone do so. Please cite sources?
"Milgram is himself responsible for analyzing the results of his own "work" improperly: The focus was put on the majority who obeyed. No attention was paid to the few that refused to obey."
He paid a lot of attention to them. He did a lot of variations trying to figure out how to help more people get into that category, and he tried to figure out what they had in common. Yes, the media and Milgram also paid a lot of attention to those who obeyed, because that was the part of the experiment that was really new and surprising. Milgram, and other psychologists, had confidently expected most people to disobey. But there's no way you can accuse Milgram of not paying attention to the disobedient subjects. Please read the book.
"As for "other" experiments that humans have done "to each other" please give me citations, and I will respond. In the Physical Sciences, I do not know offhand of any experiments performed inhumanely (and if I did, my response would be that they should not have been done). We do not, for example, drop humans or animals off a tower to see how fast they fall to the ground, in order to learn about gravity. We do not test people for electrical conductivity by sending a shock through their bodies. We do not attempt to magnetise them and see if they are attracted to Iron. etc. Nor should we send people into space without paying the utmost regard for their safety; this last example I mention to meet your response "Well, you don't study human behavior in Physics.""
I did not ask "as a PHYSICIST, compare this to other EXPERIMENTS humans have done on each other". I asked, "as a self-proclaimed HISTORIAN, please compare this to other THINGS humans have done to each other". Or are you just a historian of physics experiments? I am not surprised that a physicist would be shocked by this experiment, but I am surprised that a historian would be, and as I said, I would also be surprised if a biologist were shocked by it, compared with the various things animals do to each other.
"And finally, anyone who performed an experiment like this today would almost certainly be prosecuted for various crimes. That ought to tell you something."
Not really. What gets prosecuted as a crime, and what actually hurts people, are not very well correlated. Lots of people are in jail for doing things that didn't hurt anyone, and lots of people are well paid by governments and other large organizations for doing things that do hurt people. I am not saying that Milgram didn't hurt people, although he did try to check whether he had hurt them, and was confident he hadn't. I am just saying that people hurting each other is in no way new, and Milgram's experiment at least gave us some interesting information.
"I do not regard this as legitimate science in any meaning of the word. Not even Biology or Sociology. Anything obtained unethically has no place in the scientific enterprise."
That's certainly an unequivocal statement. By "has no place" you mean you don't want to learn from it?
"If you do not see why someone who is willing to hurt people should not be considered an expert in human behavior, I am silent, except to say that one who behaves unethically in one area is liable to do so in another; are his results honestly reported? etc."
For that, you would have to check his supporting data and videos. A lot of the point of science is that you don't have to just take someone's word for anything. You can check their evidence, or even replicate the whole experiment, as a bunch of people did, and got the same results.
"In particular, your comments about his reasoning to decide that this experiment was "worth doing" to gain scientific results are utterly revolting. I don't have to tell you where that train of thought leads."
That, and other similar trains of thought, along the lines "it's worth hurting some people for this good cause or that good cause", are indulged in by most people, and lead to the world we have right now. Milgram's subjects did not have worse lives than most humans. Maybe most Americans, but that's another story.
"In Browning's book, you would see that many of the young SS men who did indeed refuse to shoot people were not punished IN ANY WAY. So don't state rhetorically that they were "under threat to be shot if they disobeyed orders.""
I'm being rhetorical to say that if you're in the army in wartime, you can be executed for disobeying orders? Note I did not say "will inevitably be executed". Anyway, you missed the point. In Milgram's experiment, not only did the subjects commit less of a crime than the soldiers at, say, My Lai, but they were also under far less pressure to commit it, and yet many of them did anyway. The dissimilarity between subjects and soldiers cuts both ways.
"See--what resulted from Milgram's work is a tendency to obey, with the excuse that that is part of the human make-up."
Wow. This is a really dramatic case of shooting the messenger. One, if the tendency to obey hadn't existed long before Milgram published his results, none of his subjects would have obeyed the experimenter, besides which we would probably never have had wars and inquisitions. Two, practically everyone who hears or reads about the experiment comes out of it determined not to behave like Milgram's obedient subjects. In fact, some of the obedient subjects themselves came out of it strongly motivated to resist authority more effectively in the future. You have no evidence of the experiment making people more obedient. If anything it seems to do the opposite.
"That leaves no room for courage, loyalty and honor. I mean real honor. These are also--or can be--human traits."
Describing how most people behave dishonorably DOES leave room for honor. Diagnosing a problem is the first step toward fixing it, and figuring out which people it is most likely to affect, and in what situations, is a second step. Whereas I am getting the sense that you would prefer it if we'd never learned about this problem, and could just pretend that normal people already behave rationally and honorably in these situations, rather than figuring out how to help them do so.
"That's all for now, and I would appreciate if you would leave my comments intact, and not carve them up. If you want to cite, please cut and paste to your own response, and make your comments that way. New readers ought to be able to follow the dialog on the page clearly, without inserting your link/name/signature between every two or three sentences in my argument."
I thought, and think, it would be easier to follow the way I did it. That way people can see what is being said in response to what, and there is no danger of misparaphrasing or leaving big parts unanswered. But I have duplicated your part in deference to your wishes.
"It doesn't seem that you have read the subject's comments to which I referred, based on your responses to my criticisms."
I did read that. He is writing 43 years after the experiment, so any complaint about Milgram's book being written 13 years later applies three times as strongly to him. Notice that he freely acknowledges that Milgram mailed him the report about the experiment's design and results. His statements seem to be basically compatible with Milgram's, although I am guessing he was one of the earlier subjects, when the debriefing technique was not well figured out. He does not seem to have been damaged by the experience, either.
"It is the second external link from the bottom. While I do think that some of the person's own observations about himself are somewhat juvenile and incorrect, the account he gives is very interesting and, in my opinion, quite chilling."
"Chilling" is a bit much. It certainly doesn't sound like a shining example of experimental competence, though.
"Again--what results of any value can you point to that followed from this "scientific discovery?" 66.108.4.183 21:59, 4 June 2006 (UTC) Allen Roth"
The fact that many ordinary people will obey an authority, even when there is not that much reason to, and even if it may involve killing an innocent man. Also, lots of subsidiary facts about what circumstances make this more or less likely to happen. DanielCristofani 02:16, 5 June 2006 (UTC)

I could respond to each of your comments, but I think that our statements speak for themselves, and that any further discussion will only be repetitive; let readers decide for themselves based on the above discussion, and their independent thinking. I will merely confine myself to your last sentence. I continue to maintain that we have learned absolutely nothing of real value from this "experiment." You state we have learned that many people will obey an authority. Of what scientific or other value is that? I see none, in terms of beneficial results. Even you do not mention any concrete results or applications. Even over 40 years later, there are none. In any case, even if there were a beneficial outcome, I still maintain that scientific investigation is not exempt from ordinary moral constraints; no scientist is given license to abandon ethical principles in order to conduct his work. Consequently, experimental results obtained unethically are not to be admitted into the scientific domain. Your glib comment that if we learned anything from Milgram's experiments, they were justified, is appalling to me. Milgram's and Zimbardo's work (and I do think that they are very closely allied, ethically) fall in the same class. They are inherently offensive to human values, dignity, and honor. If one wants to investigate human behavior, do so in a legitimate and ethical manner. That is all that is required.

As I said, I will not continue an endless debate, which is already degenerating into sophistry. If you respond, you may have the last word; I don't feel that I must. 66.108.4.183 14:12, 5 June 2006 (UTC) Allen Roth

"I could respond to each of your comments, but I think that our statements speak for themselves, and that any further discussion will only be repetitive; let readers decide for themselves based on the above discussion, and their independent thinking. I will merely confine myself to your last sentence. I continue to maintain that we have learned absolutely nothing of real value from this "experiment." You state we have learned that many people will obey an authority. Of what scientific or other value is that? I see none, in terms of beneficial results. Even you do not mention any concrete results or applications. Even over 40 years later, there are none."
We also don't have an HIV vaccine yet. Describing a problem is a necessary first step, but solving it is sometimes much harder. There are some extremely powerful institutions that need people to obey without thinking too much; and it is hard to make large-scale changes to human behavior or to the social structure anyway. All the same I submit that we are better off knowing how people behave and why, regardless of whether we have enough power to solve the problems we find.
"In any case, even if there were a beneficial outcome, I still maintain that scientific investigation is not exempt from ordinary moral constraints; no scientist is given license to abandon ethical principles in order to conduct his work."
Agreed.
"Consequently, experimental results obtained unethically are not to be admitted into the scientific domain."
I don't see that that follows. Some experiments have been done that should not have been done. But throwing away solid information, in order to punish scientists who are already dead, seems not just futile but unscientific.
"Your glib comment that if we learned anything from Milgram's experiments, they were justified, is appalling to me."
See, if you had been responding point-by-point, you would have looked for a place where I made that glib comment or a similar one. Point-by-point response helps people to avoid fighting strawmen.
"Milgram's and Zimbardo's work (and I do think that they are very closely allied, ethically) fall in the same class. They are inherently offensive to human values, dignity, and honor. If one wants to investigate human behavior, do so in a legitimate and ethical manner. That is all that is required."
No problem, then. Whenever Milgram's experiment is discussed, it is generally mentioned that the same experiment could not be done today, nor Zimbardo's, etc. etc. You should be happy about the current safeguards on psychological research. This whole wanting-to-erase-Milgram-and-his-experiment-from-the-history-of-science thing has to be coming from somewhere else.
"As I said, I will not continue an endless debate, which is already degenerating into sophistry."
Best guess is you don't mean both sides are doing so equally, so this reads as a veiled insult.
"If you respond, you may have the last word; I don't feel that I must. 66.108.4.183 14:12, 5 June 2006 (UTC) Allen Roth"
Okay. Done. DanielCristofani 01:39, 6 June 2006 (UTC)

As a note, some psychologists, especially social psychologists, are calling for less stringent guidelines when it comes to deception of participants. No one is suggesting that we go back to the days of Milgram and Zimbardo, but ethics committees are now extremely strict about any form of deception, and as such, many psychological experiments have extremely weak experimental manipulations, which leads to small effect sizes.

Removing "Parallel between the Experimenter and the Participant"

I think the suggestion of a "Parallel between the Experimenter and the Participant" is not accurate. The experimenter does not blindly submit to the "science" authority, but consciously weighs the advantages and disadvantages of performing the experiment. The way I see it, the whole point of the experiment is that, when ordered to do so, people do things they wouldn't do if they decided uninfluenced.

If anyone has any reason not to delete it, please tell me. --Waltervulej 02:19, 21 Nov 2004 (UTC)

About the question of whether participants were debriefed: Yes, they were. Read Milgram's book (Obedience to Authority) for all the details. They spent a long time with each person, explaining and listening to them, and they did a follow-up questionnaire to see how participants felt. In the follow-up, the "learner" came in, and explained that he was fine and had never been hurt.

Also, re someone saying participants "knew" Yale wouldn't condone hurting someone. First of all, the experiment has been replicated in many places. And one of Milgram's variants was to do it off campus with no connection to Yale. The compliance went down a little, as I recall--but not a lot. Also, about that same question and Nazis--wouldn't (some) people have similar faith in their government's orders? I know this is a hard experiment to swallow. But we shouldn't let our desire to maintain a belief in autonomous decision making override the facts of this experiment. Not everyone was complicit; but almost 2/3 were. Even when the "learner" begged to be released, and said he had a heart condition, participants followed orders to increase the shocks. It's a great book, and will answer a lot of the questions posted here.

The preceding unsigned comment was added by 152.163.100.10 (talk • contribs) 4 Sep 2005.

Interpretation needs attribution

"This laughter is not sadistic, but nervous laughter that many use to help calm fears they are having." According to whom? -- Jmabel | Talk 20:45, Feb 1, 2005 (UTC)

According to Milgram, I would assume. Sadly, I don't have any sources on hand to back it up. But I've seen videos of the experiment, and I can attest that the laughter was indeed nervous, not sadistic. DaveTheRed 08:32, 15 Mar 2005 (UTC)
In the abstract to Milgram's 1963 "Behavioral Study of Obedience" in Journal of Abnormal and Social Psychology: "One unexpected sign of tension - yet to be explained - was the regular occurrence of nervous laughter, which in some Ss developed into uncontrollable seizures." It does not seem to me that Milgram thought it would help calm fears.

Suicide?

Didn't someone commit suicide as a result of participating in the experiment? I remember hearing such a thing, though I could be wrong... Dysprosia 08:08, 3 May 2005 (UTC)

I would expect to find some hits to such an occurrence on "suicide" and "milgram" in google, but haven't had any luck--mostly the hits I get are Jonestown and suicide bombers. --Tony Sidaway|Talk 10:51, 3 May 2005 (UTC)
I've never heard of this, and I seriously doubt that it is true. -- Jmabel | Talk 05:31, May 4, 2005 (UTC)


Explanation of Tony Sidaway's revert of an edit by 83.118.18.78

83.118.18.78 made an edit at 12:27 UTC changing "In the variation where immediacy of the "learner" was closest, participants had to physically hold the learner's arm onto a shock plate, which decreased compliance" to read "...increased compliance". As I understand it, the more immediate the interaction, the lower the compliance (of the participant and unwitting subject--remember the "learner" was not the real subject of the experiment and his compliance or otherwise was a fake). I have reverted. --Tony Sidaway|Talk 13:11, 5 May 2005 (UTC)

Mon oncle d'Amérique

I removed the reference to "Mon oncle d'Amérique". The experiment shown in the film is about stress and inhibition. --Geremy78 15:53, 10 September 2005 (UTC)

Tense in "Method of the experiment"

The section "Method of the experiment" starts in past tense (describing how people were recruited) and then continues in present tense. It should all be in one tense, but I do not know whether past or present is appropriate. I suspect past tense, as it is all completed, but present tense is sometimes used when describing experiments. --zandperl 23:13, 30 September 2005 (UTC)


Falsification of results

I'm just throwing this question out there: have there been any accusations made that Milgram falsified his results? I've been searching online but haven't found any confirmation of this. --Alexxx1 04:25, 14 October 2005 (UTC)

Screaming

I couldn't find any indication in the 1963 article of screaming or tape-recorded feedback. The article has a section on the feedback of the student which describes the banging and non-responsiveness, but nowhere in the article did I encounter any mention of screaming. Hence I removed the references from the Wikipedia section... However, I see that Milgram in the quote at the beginning of the Wikipedia article talks about screaming victims, so I don't quite know what to make of that. Were the victims screaming in the original experiment? The original article suggests not, but I guess I'm really not sure. Feel free to revert if you know more about this than I do.JJM 00:59, 15 December 2005 (UTC)

In the first pilot studies, they tinkered with the feedback from the victim, strengthening it until they got measurable levels of disobedience, and it was during this time that they started using tape-recorded protests and screams. After that, Milgram did a lot of variations on the experiment, trying to figure out what conditions would produce obedience or disobedience. The wall-banging version comes from the earliest set of variants, which involved different degrees of proximity between the "teacher" and "learner": one where the learner banged on the wall; the standard version where the learner could be heard protesting and screaming; a version where the learner is there in the same room, protesting and screaming; and a version where the teacher actually had to push the learner's hand down onto the shock plate. Obedience tended to decrease with proximity, with only 30% of subjects being willing to shock the learner clear to the end when they had to push his hand down. DanielCristofani 03:11, 15 December 2005 (UTC)

"teacher" vs "subject".

There has been a move to replace "subject" with "teacher" throughout the section "Method of the experiment". This comes from a desire for consistency, but I believe it is a bad move. "Teacher" and "learner" are roles in the fictitious "experiment on the effects of punishment on learning", not in Milgram's experiment on obedience to authority. It is a subtle judgement call, but I think it is appropriate to use the words "teacher" and "learner" when the focus is on the fictional setting, and the words "subject" and "confederate" when the focus is on what is really happening.

(About "subject" vs. "participant", the APA style FAQ referenced above says "Subjects is perfectly appropriate when the person cannot him- or herself provide informed consent." This is arguably true of Milgram's experiment, and I say this as someone who thinks the experiment was a great idea.) DanielCristofani 05:08, 6 January 2006 (UTC)

predictors of defiance to the experiment?

anyone see that there is a section missing - the predictors of defiance? For instance, I know that Lawrence Kohlberg showed that principled levels of moral development were one of the only predictors that someone would stop early. There were a couple of others too. Anyone? JoeSmack Talk 14:40, 30 March 2006 (UTC)

Another experiment

I once heard about a related experiment that Milgram later conducted. As the first phase, Milgram solicited applications from students for a fictitious award. Each application was to include an essay about altruism. After some time, Milgram invited each test subject (applicant) to an interview. Upon arriving at the office on campus designated for the interview, the subject found a makeshift sign indicating that the location had been moved to another building, at the opposite side of the campus. The subject then rushed to the new location. That was all intended to set up the subject for what happened next.

Milgram placed an actor on the route from the first interview location to the second. As each subject passed by, the actor feigned a heart attack or some other life-threatening affliction. The object of the experiment was to see how many of the subjects would, upon arriving at their destination, mention the stricken person encountered along the way. Few of them did.

Does anyone here know anything else about this experiment? --Smack (talk) 03:48, 12 September 2006 (UTC)

I recall seeing this supposed experiment attributed to a nameless professor of religion, and it seems strongly influenced by the story of the Good Samaritan. That doesn't prove anything, but it makes me suspect that it's an urban legend. —Eric S. Smith 13:19, 27 April 2007 (UTC)

Reactions: Unsourced & OR

The Reactions section is unsourced and seems like one editor's opinion on the significance of the experiment. All analyses of the experiment require sources. I'll start chopping the section if no-one improves it soon. Ashmoo 04:54, 18 September 2006 (UTC)

Controversies

From: Forsyth, Donelson R. (2006) Group Dynamics Belmont, CA: Thomson Higher Education

Page 256-257

Milgram's results sparked controversies that are unresolved even today (Blass, 2000; A. G. Miller & Brief, 1995). Some researchers believe that the participants were not taken in by Milgram's subterfuge; they knew that no shocks were being administered, but they played along so as not to ruin the study (Mixon, 1977; Orne & Holland, 1968). Milgram's research team, however, carefully interviewed all the participants, and fewer than 20% challenged the reality of the situation (Elms, 1995). Moreover, if participants saw through the elaborate duplicity, then why did they become so upset?

...

The distress of the participants was so great that the publication of the study sparked a controversy over the ethics of social-psychological research (A. G. Miller, 1995). Even a museum exhibit that featured the Milgram experiment sparked public debate over its ethics when it toured U.S. science museums (C. Marsh, 2000).

--84.48.194.249 17:17, 7 October 2006 (UTC)

If you have the Forsyth book, can you list full citation details on the sources it relies on, so they can (hopefully) be located on PubMed or elsewhere? Sandy 17:25, 7 October 2006 (UTC)

My thoughts

In the first paragraph, there are four terms used to describe what seems like three participants in the experiment: confederate (which means an accomplice), experimenter, victim and participant. Can someone confirm that the victim (an Irish-American accountant who played the role of the victim) is the confederate? It is quite confusing because it makes it seem as if the confederate and the victim are two different people. Can anyone make this more user-friendly?

On another note I think the diagram should be in the section "method of the experiment" rather than in the introduction. Tremello22 20:57, 29 November 2006 (UTC)

Why Germans obeyed orders to murder Jews and others

Milgram found that

Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority. [1]

Jewish people in WWII Germany heard rumors of the Holocaust but couldn't believe their fellow human beings could commit the crime of genocide. They had no idea people could be so cruel.

Milgram discovered that people can be cruel, even when not forced to be. Imagine how much more cruel we can be made to be, when we ourselves are threatened with torture or death.

I don't think this excuses Eichmann or the Nazis. It shows, rather, how far the human race has yet to go, if we are to rise above the moral level of predatory animals. --Uncle Ed 18:36, 4 December 2006 (UTC)

Just a frequent reader of wikipedia here, but it seems some smartaleck has posted/attached 3 pictures of a penis or penile implant at the very bottom of the Milgram experiment article. I'll leave it to the powers that be to correct it. JJC152.131.10.193 14:13, 9 December 2006 (UTC)