Once, we came of age under the shadow of something called a Permanent Record. Nobody ever actually saw one, but as youngsters we understood that we had to keep our own clean, since stains could do lasting damage.

Plainly, the idea of an authoritative, ineradicable ledger on individual behavior is a powerful one. Widespread, too. You see it in everything from the divine Book of Life to the gift list kept by Santa, who knows if you've been bad or good.

That permanent record meant somebody was paying attention, which was good, but it was also a dark and oppressive background presence, since it enabled even trivial sins to curse our futures.

Good thing it was largely mythic. Back then, actual record-keeping was spotty, and technology had no way to gather the manifold tracks we each left into some all-knowing compendium.

No longer. Welcome to the digital age. Its mighty search engines have spawned a virtual permanent record for millions of individuals. What gets in it and with what prominence — those are mysteries, depending on the alchemy of particular search engines. Generally, they suck up most anything that was published or resides in Internet-accessible public records. (They don't scour social media like Facebook, yet.)

That means the fraternity-house dust-up that led to a sleepover in jail, or the rude remark at a political rally, or any of a thousand missteps and embarrassments that in a pre-digital age would have faded into oblivion now remain vivid, alive and, potentially, toxic.

Hence the importance of last month's ruling by the European Union's highest court. It authorizes people to demand that links to material that threatens their privacy be scrubbed from search results.

The case involved Google, the California-based colossus that handles roughly 90 percent of Europe's Internet searches. It was brought by a Spaniard who challenged a link to a 1998 item in a Catalan newspaper about the auction of his home, repossessed to repay debts he owed. He reasoned that the matter had been resolved ages ago and that there was no reason people who Googled his name now should learn about it.

The European Court of Justice agreed, and ruled that if an individual complains, a search engine should investigate. If it determines the link is to information that is "inadequate, irrelevant or no longer relevant or excessive," it should delete the listing from the search results.

Matters of public significance were excluded. Nor did the judgment question publishing the offending material in the first place: It applied only to "data controllers," not to news media — and links to the material might still be available through Google, just not when you're searching under the individual's name.

Nevertheless, the ruling is being called a landmark, and much of the commentary focuses on the "right to be forgotten," which underlies legal protections in such countries as France, Italy and Britain. The idea is that at some point misdeeds — even criminal behavior — become part of one's personal past and shouldn't be exposed publicly without good reason. That isn't merely a concession to individual feelings; it reflects a cultural preference for letting people move on with their lives, a belief that penitent wrongdoers shouldn't have red letters emblazoned on their foreheads. That's very different from the U.S. affection for punishment in perpetuity.

In the first four days after the judgment was issued, 12,000 Europeans flooded Google with demands. Still, the judgment is troubling: It erects a barrier to truthful information, which is abhorrent to U.S. tradition. It also reflects a bizarre accommodation: Nasty items remain intact in cyberspace; they're just rendered far harder to find.

It also, as a British parliamentarian put it, "forces Internet search engines to police what should and shouldn't be wiped from public view without any clear criteria — let alone ones determined by democratically elected lawmakers."

Besides, is forced ignorance the proper response to the specter of the toxic Permanent Record? Isn't the problem less what people know than what they make of it?

I sympathize with people whose pasts are dredged up for worldwide display by search engines run by algorithms that even their creators can't explain, let alone justify.

But deliberate concealment is a disturbing response. Perhaps the answer lies in making sure people can respond prominently to prejudicial postings. And in the longer term, the challenge is to nurture an online culture with qualities of clemency and compassion that are as mature and far-reaching as the vast awareness the Internet enables.

Edward Wasserman is dean of the University of California, Berkeley, Graduate School of Journalism. He wrote this article for McClatchy Newspapers.