What should we worry about? Scan the headlines, and the answers seem obvious. We should worry about Congress and the debt ceiling, about gun violence and climate change, about terrorism, the euro, schools, taxes, entitlement programs and Kim Kardashian's sunburn.

But how confident are we that we're worried about the right things? History shows that our concerns are often misguided or conditioned on outdated assumptions. Ask 100 motorists the colors of a "yield" sign, and most will say yellow and black, even though the signs have been red and white since 1971. But if we still believe in yellow and black despite all evidence to the contrary, what other assumptions will lead us astray? Our endless fretting over Y2K didn't stop terrorists armed with box cutters.

So, what should we really worry about? It's the official question John Brockman posed this year to his jury of top intellectuals. Brockman is the über literary agent, cultural impresario and best friend to the world's smartest people. He runs edge.org, a science/arts salon with lofty ambitions: "To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves."

These are people who, presumably, don't sweat the small stuff. Brockman's question drew 150 short essays from among the salon's 660 vetted contributors. Concerns about runaway viruses and Chinese eugenics made the cut, as did a handful of glib commentaries about the perils of worrying. But the overarching theme was easy to spot:

We should worry about the interplay between humans and technology.

How, for example, is the computer changing our bodies, minds and social relationships? Do we have the mental capacity to properly analyze the enormous flow of data that drives our decisions? Can we depend on the judgments of search engines? Will information technology stunt human curiosity?

Does the Internet's ability to customize information make us more parochial in our thinking? Will narrow, self-interested thinking destroy democracy? Is there an "idiocracy" in our future? Why are smart people shunning politics?

Will technology continue to widen the wealth gap between haves and have-nots? Will it widen the cultural and intelligence gaps? By causing turmoil in the lower and middle ranges of the economy, will technology reignite fascism? That is, will it elevate self-interest and devalue empathy to the point that demagogues can flourish?

Are our systems -- Internet, financial markets, energy, etc. -- truly stable, or does their growing complexity make them more vulnerable? Does our dependence on complex systems make catastrophes more likely? Does the interlocking nature of systems make cascading crises more likely?

Does the Internet devalue the written word? If writing decays, will thinking also decay? Do humans still have the attention span required to solve problems? Can we reliably separate the trivial from the significant? Can we tell fact from fantasy?

Those are some of the things smart people worry about, with psychologist Susan Blackmore offering this cheery conclusion: "We are unwittingly, but eagerly, outsourcing more and more of our manual skills to machines. Our minds are losing touch with our bodies and the world around us, and being absorbed into the evolving technosphere."

But if it's true that we're becoming machines, maybe we can program ourselves not to worry so much about it.


An editorial of the Star Tribune, Minneapolis.