I've been teaching Stanley Milgram's electric-shock experiment to business school students for more than a decade, but "The Experimenter," a movie out this week about the man behind the famous social science research, illuminates something I never really considered. In one scene, Milgram (played by Peter Sarsgaard) explains his experiment to a class at Harvard: A subject, assigned to be the "teacher," is ordered to administer increasingly intense shocks to another study participant in the role of "learner," allegedly to illustrate how punishment affects learning and memory. Except, unbeknown to the subject, the shocks are fake, the other participant works for the lab and the study is about obedience to authority. More than 60 percent of subjects obeyed fully, delivering up to the strongest shock, despite cries of pain from the learner. Those cries were prerecorded, but the teachers' distress was real: They stuttered, groaned, trembled and dug their fingernails into their flesh even as they did what they were asked to do.
"How do you justify the deception?" one student asks. "I like to think of it as an illusion, not deception," Milgram counters, claiming that the former has a "revelatory function." The student doesn't buy it: "You were delivering shocks, to your subjects, psychological shocks … methodically for one year."
Before seeing the film, I didn't fully appreciate that parallel. In the grainy, black-and-white documentary footage that the real-life Milgram produced, he remains off-camera. I'd never put much thought into the moral dilemma he faced. I'd never asked myself what I would have done in his position.
I'm fairly certain that — even in an era before institutional review boards, informed consent and mandatory debriefings — I would have determined that it's wrong to inflict that much psychological distress. But I can't be absolutely sure.
When I ask students whether, as participants, they would have had the courage to stop administering shocks, at least two-thirds raise their hands, even though only one-third of Milgram's subjects refused. I've come to refer to this gap between how people believe they would behave and how they actually behave as "moral overconfidence." In the lab, in the classroom and beyond, we tend to be less virtuous than we think we are. And a little moral humility could benefit us all.
• • •
Moral overconfidence is on display in politics, in business, in sports — really, in all aspects of life. There are political candidates who swear they won't use attack ads until, late in the race, trailing in the polls and under pressure from donors and advisers, they go increasingly negative. There are chief executives who come in promising to build a business for the long term, but then condone questionable accounting gimmickry to satisfy short-term market demands. There are baseball players who shun the use of steroids until they age past their peak performance and start to look for something to slow the decline. These people may be condemned as hypocrites. But they aren't necessarily bad actors. Often, they've overestimated their inherent morality and underestimated the influence of situational factors.
Moral overconfidence is in line with what studies find to be our generally inflated view of ourselves. We rate ourselves as above-average drivers, investors and employees, even though math dictates that can't be true for all of us. We also tend to believe we are less likely than the typical person to exhibit negative qualities and to experience negative life events: to get divorced, become depressed or have a heart attack.