A brain study helps explain the John Edwards types among us.
Is John Edwards a criminal or merely a sleazebag of breathtaking proportions? The jury couldn't quite make up its mind.
But various reasonable questions come to mind in the wake of the mistrial. For example: Whoa, close call, can you believe that this guy might have been president? And: Why didn't he have a shred of willpower when it came to honesty and doing the right thing?
The spectacle reminds me of something moving about honesty I read recently, in, of all places, a scientific journal.
In a 2010 report in the prestigious Proceedings of the National Academy of Sciences, Harvard psychologists Joshua Greene and Joseph Paxton asked a great question: When people are confronted with the opportunity to lie, what differs in the brains of people who succumb to the temptation and those who don't?
For the study, each subject was placed in a functional MRI machine, a scanner that indicates the ongoing levels of activity in different brain regions. The volunteers had a simple task. There'd be a series of virtual coin tosses by a computer, and before each one, the subject had to predict the outcome. Guess right, and there'd be a financial reward.
But there was a twist. Subjects were told a great piece of nonsense, namely that the purpose of the study was to determine whether people had better paranormal powers at predicting the future when the predictions were made in private.
To examine this, scattered through the series of coin tosses would be the occasional instance in which the subject made his prediction privately instead of entering it before the toss. Then, after the toss, he'd be asked: So, did you guess right? In other words, people were given the opportunity to lie.
Coin tosses being what they are, predictions could be expected to be correct roughly half the time. If the success rate skyrocketed when there was the opportunity to cheat, odds were that there was a liar in the brain scanner.
To start, here's some demoralizing news -- fewer than half of the people were in the clear-cut-honest range, with success rates remaining around 50 percent when they had the chance to cheat.
About a third seemed to be lying often enough that their success rates were well above 50 percent at those times. The remaining subjects had success rates that were somewhere in between, and thus hard to classify.
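For readers curious about the arithmetic behind "well above 50 percent": the logic is a one-sided binomial test, flagging success rates too high to chalk up to luck. The paper's exact statistical criterion isn't given here, so this is only a minimal sketch of the idea, in Python, with function names of my own invention:

```python
from math import comb

def binomial_tail(successes: int, trials: int, p: float = 0.5) -> float:
    """Probability of `successes` or more correct guesses in `trials`
    fair coin tosses -- the one-sided binomial tail."""
    return sum(comb(trials, k) * p**k * (1 - p) ** (trials - k)
               for k in range(successes, trials + 1))

def looks_like_cheating(successes: int, trials: int,
                        alpha: float = 0.05) -> bool:
    """Flag a self-reported success rate unlikely to arise by chance.

    An honest guesser hovers near 50 percent; a rate whose tail
    probability falls below `alpha` is statistically suspicious.
    """
    return binomial_tail(successes, trials) < alpha
```

For example, reporting 28 correct guesses out of 40 private trials (70 percent) has well under a 5 percent chance of happening honestly, while 21 out of 40 is entirely unremarkable. The "hard to classify" middle group in the study corresponds to rates that clear 50 percent but not the statistical threshold.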
What went on in the brains of people when temptation beckoned? Let's start with the people who lied with sufficient frequency that they could be detected statistically. Central to the results was a region called the prefrontal cortex, or the PFC.
This is one interesting part of the brain -- it's all about self-discipline, gratification postponement, emotional regulation, control of impulsiveness.
It's the part of the brain that makes you do what's hard to do when it is the right thing to do. It's bigger and more complex in humans than in any other species, is our most recently evolved brain region and is the last part of our brains to fully mature.
So when the opportunity to cheat arose, the activity in the PFCs of liars shot up like crazy. The scans showed the trace of an epic moral battle -- do it, don't, yes do it, no don't -- that the liars lost.
And what were levels of activity in the PFCs of those who, from a statistical standpoint, never lied? Greene and Paxton present two differing views in moral philosophy about honesty: Is honesty an act of will?
Does it require a person working hard to refrain from doing the wrong thing? Or is it an act of grace, effortless because temptation isn't tempting? In the study's paradigm, it was grace all the way -- among the unequivocally honest, there was no increase in PFC activity when the chance to cheat arose.
In the face of real life's temptations, a majority of us are not going to get by on pure grace. We ooze our human frailties.
Yet there are those who glide through minefields of enticement, doing the difficult, rare, brave, correct thing as naturally as breathing. It can seem hard to believe that a person could really be this way.
But a high-tech brain scanner documented that it's possible. It's an achievable goal. And should be. Even for someone who would be president.
Robert M. Sapolsky is a professor of neuroscience at Stanford University and the author of "A Primate's Memoir," among other books. He wrote this article for the Los Angeles Times.