Do you think of yourself as open-minded? For a 2017 study, scientists gave 2,400 well-educated adults the chance to consider arguments on politically controversial issues (same-sex marriage, gun control, marijuana legalization) that ran counter to their beliefs. Both liberals and conservatives, they found, were similarly adamant about avoiding contrary opinions.

When it came to same-sex marriage, for example, two-thirds of those surveyed passed up a chance to pocket money rather than take some time to just look at counterarguments, let alone seriously entertain them.

The lesson is clear enough: Most of us are probably not as open-minded as we think. That is unfortunate, but it is something we can change. A hallmark of teams that make good predictions about the world around them is what psychologists call "active open-mindedness." People who exhibit this trait routinely do something, alone or together, that rarely occurs to the rest of us: They imagine their own views as hypotheses in need of testing.

They aim not to bring people around to their perspective but to encourage others to help them disprove what they already believe. This is not instinctive behavior. Most of us, armed with a Web browser, do not start most days by searching for why we are wrong.

As our divisive politics feed our tilt toward confirmation bias, it is worth asking whether this instinct to think we know enough is hardening into a habit of poor judgment. Consider a study conducted in the run-up to the Brexit vote: a small majority of both Remainers and Brexiters could correctly interpret made-up statistics about the efficacy of a rash-curing skin cream. But when the same voters were given similarly fabricated data, presented this time as if it indicated that immigration either increased or decreased crime, hordes of Brits suddenly became innumerate and misinterpreted statistics that disagreed with their beliefs.

A separate study by Yale professor Dan Kahan and colleagues found the same phenomenon in the U.S. when voters were given made-up data about skin cream and gun control.

But Kahan and his colleagues have also found a reason for hope: a personality trait that seemed to war with biased judgment. They called it science curiosity.

Science curiosity is different from science knowledge. In Kahan's studies, science-curious folk consistently chose to look at new evidence, whether or not it aligned with their beliefs. Less science-curious adults, by contrast, became more resistant to contrary evidence and more politically polarized as they gained subject-matter knowledge.

This should not be entirely surprising. University of Pennsylvania psychologist Philip Tetlock reached a similar conclusion in a 20-year study that tested the ability of experts to make accurate predictions about geopolitical events. The results, in short, showed that the average expert was a horrific forecaster. Areas of specialty, academic degrees and (for some) access to classified information made no difference. Some of the most narrowly specialized experts actually performed worse as they accumulated credentials. It seemed that the more vested they were in a worldview, the more easily they could find information to fit it.

There was, however, one subgroup of scholars that did markedly better: those who were not intellectually anchored to a narrow area of expertise. Rather than hiding from contrary views, they crossed disciplinary and political boundaries to seek them out.

Tetlock gave the forecasters nicknames borrowed from Isaiah Berlin's well-known essay "The Hedgehog and the Fox": the narrow-view hedgehogs, who "know one big thing" (and are terrible forecasters), and the broad-minded foxes, who "know many little things" (and make better predictions). The latter group's hunt for information was a bit like a real fox's hunt for prey: they roamed freely, listened carefully and consumed omnivorously.

Eventually, Tetlock and his collaborator, Barbara Mellers, assembled a team of foxy volunteers, drawn from the general public, to compete in a forecasting tournament. Their volunteers trounced a group of intelligence analysts who had access to classified information. As Tetlock observed of the best forecasters, it is not what they think but how they think. They argued differently: foxes frequently used the word "however" when assessing ideas, while hedgehogs tended toward "moreover." Foxes also looked far beyond the bounds of the problem at hand for clues from other, similar situations.

A reasonable conclusion is that curiosity, paired with a broad range of knowledge, might be a kind of superpower. Hedgehog experts know more than enough about the minutiae of an issue in their specialty to cherry-pick details that fit their preconceived notions; their deep knowledge works against them. More skillful forecasters step back from a problem to consider completely unrelated events that share structural commonalities: the "outside view." It is their breadth, not their depth, that scaffolds their skill.

David Epstein is the author of "Range: Why Generalists Triumph in a Specialized World," from which this is adapted. He wrote this article for the Washington Post.