Do you think of yourself as open-minded? For a 2017 study, scientists asked 2,400 well-educated adults to consider arguments on politically controversial issues — same-sex marriage, gun control, marijuana legalization — that ran counter to their beliefs. Both liberals and conservatives, they found, were similarly adamant about avoiding contrary opinions.
When it came to same-sex marriage, for example, two-thirds of those surveyed turned down a chance to earn money in exchange for simply reading counterarguments, never mind seriously entertaining them.
The lesson is clear enough: Most of us are probably not as open-minded as we think. That is unfortunate, but it is something we can change. A hallmark of teams that make good predictions about the world around them is something psychologists call "active open-mindedness." People who exhibit this trait routinely do something, alone or together, that rarely occurs to most of us: They treat their own views as hypotheses in need of testing.
They aim not to bring people around to their perspective but to encourage others to help them disprove what they already believe. This is not instinctive behavior. Most of us, armed with a Web browser, do not start most days by searching for why we are wrong.
As our divisive politics feed our tilt toward confirmation bias day after day, it is worth asking whether this instinct to assume we already know enough is hardening into a habit of poor judgment. Consider that, in a study during the run-up to the Brexit vote, a small majority of both Remainers and Brexiters could correctly interpret made-up statistics about the efficacy of a rash-curing skin cream. But when the same voters were given the same sort of made-up data, this time presented as evidence that immigration either increased or decreased crime, hordes of Brits suddenly became innumerate, misinterpreting statistics that disagreed with their beliefs.
A separate study by Yale professor Dan Kahan and colleagues found the same phenomenon in the U.S. when voters were given made-up data about skin cream and gun control.
But Kahan and his colleagues also found a reason for hope, a disposition that seemed to counteract biased judgment: They called it science curiosity.
Science curiosity is different from science knowledge. Science-curious folk always chose to look at new evidence, whether or not it aligned with their beliefs. Less science-curious adults, by contrast, became more resistant to contrary evidence, and more politically polarized, as they gained subject-matter knowledge.