Imagine your 18-year-old daughter is decapitated in a car accident. Gruesome police photographs of her body are leaked onto the Internet. Every time someone searches your family's name, the photos pop up at the top of the page. That's what happened to Christos and Lesli Catsouras because in the United States, unlike in Europe, search engines are not required to act on requests by individuals to remove such links.

That's why our nonprofit consumer group has petitioned the Federal Trade Commission to grant every American "the right to be forgotten," a position the Washington Post criticized in an Aug. 28 editorial ("Stifling the Internet") for potentially opening the door to the purging of "unflattering" links upon request. We believe that families such as the Catsourases should have the right to ask the Internet's corporate gatekeepers to stop elevating deeply disturbing, unauthorized, irrelevant, excessive or distorted personal information to the top of search results associated with their names.

Extending the right to be forgotten to Americans would not mean that government would limit freedom of expression, as the Post suggested. True censorship occurs when a government reviews all media and suppresses the parts it deems objectionable on moral, political, military or other grounds. With a right to be forgotten, Google, Yahoo and other corporations — not the government — would decide what material should not be provided in response to search requests, while the material would still remain on any websites that posted it.

Google may be battling this right in the United States, but in Europe it has shown that it is perfectly capable of separating the wheat from the chaff. Google reports that it has evaluated more than 310,000 requests to remove more than 1.1 million URLs. It has removed about 42 percent of those URLs and left the other 58 percent alone.

The sorts of requests that Google has denied involve people who want embarrassing, but still relevant, information excised from the Web. For example, Google did not remove links to recent articles reporting on the arrest and conviction of a Swiss financial professional for financial crimes. He's still in that business, so those who might deal with him should know. Google also denied a request from a man in Britain to remove references to his dismissal for sexual crimes committed on the job. Such information is relevant to his next employer.

Requests that Google has honored also make sense. A rape victim in Germany asked it to remove a link to a newspaper article about the crime. A woman in Italy asked for the removal of links to a decades-old article about the murder of her husband in which her name was used. Google rightly complied, as the widely accessible information victimized these individuals all over again.

Such readily accessible material can be devastating, unjustly foreclosing economic and social opportunities. The more prominent the result, the more credible, accurate and relevant it can seem, even if the opposite is true.

For example, a Florida doctor locked herself into a bedroom to avoid a violent boyfriend. After he jimmied the lock with a knife, she scratched his chest with her fingernails. He told police she had used the knife on him. Police arrested them both and charged her with aggravated assault with a deadly weapon. The charges against her were soon dropped, but she had to pay thousands to websites to remove her mug shot.

A middle-aged school guidance counselor disclosed when she was hired that she had modeled lingerie in her late teens, but she was still fired after the photos surfaced on the Web. It made no difference that the photos were irrelevant to her job.

U.S. law already recognizes that certain information should become irrelevant once the passage of time has demonstrated that an individual is not likely to repeat a mistake. The Fair Credit Reporting Act, which is enforced by the FTC, dictates that debt collections, lawsuits, tax liens and even arrests for criminal offenses in most cases be considered obsolete after seven years and thus excluded from credit reports.

This concept is not lost on Google. When a teacher in Germany who was convicted of a minor crime more than 10 years ago contacted the company, it removed links to an article about the conviction from search results for the teacher's name. But public figures are a different matter. When a high-ranking public official asked it to remove recent articles discussing a decades-old criminal conviction, Google declined.

Google touts its privacy principles, claiming that it strives to offer its diverse users "meaningful and fine-grained choices over the use of their personal information." It's deceptive and unfair for Google to make this claim but not to honor the privacy it purports to protect.

Google makes money off online searches. It has an obligation not to exploit or appropriate the salacious details of people's lives in the pursuit of clicks and money without considering petitions to have such details removed.

The Catsouras family and others have the right not to be traumatized forever by images or information that never belonged in the public domain. They deserve the right to bury the past and move on. Google's refusal to answer the family's pleas absent a law compelling it to do so shows exactly why the FTC needs to act.

Liza Tucker is a consumer advocate with the nonprofit group Consumer Watchdog. She wrote this article for the Washington Post.