Should we worry that technology companies can secretly influence our emotions? Apparently so.
A study recently published by researchers at Facebook, Cornell University and the University of California, San Francisco, suggests that social networks can manipulate their users' emotions by tweaking what is allowed into a user's news feed. The study, published in the Proceedings of the National Academy of Sciences, altered the news feeds delivered to almost 700,000 people for a week without obtaining their consent to be studied. For some, posts carrying happy news were filtered out of the feed; for others, sad news was suppressed.
The researchers were studying claims that Facebook could make us feel unhappy by creating unrealistic expectations of how good life should be. But it turned out that subjects' own posts grew more negative when the good news in their feeds was suppressed. Individuals were not asked to report on how they felt; instead, their writing was analyzed for vocabulary choices that were thought to indicate mood.
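That measurement step can be made concrete. Here is a minimal sketch, in Python, of word-list sentiment scoring in the spirit of the vocabulary analysis described above; the word lists and the scoring rule are hypothetical illustrations, not the study's actual tool, which relied on much larger curated lexicons.

    # Hypothetical word lists; stand-ins for a real sentiment lexicon.
    POSITIVE = {"happy", "great", "love", "wonderful", "good"}
    NEGATIVE = {"sad", "terrible", "hate", "awful", "bad"}

    def mood_score(post: str) -> float:
        """Share of emotional words that are positive, in [0, 1].
        Posts with no emotional words score a neutral 0.5."""
        words = [w.strip(".,!?") for w in post.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return 0.5 if pos + neg == 0 else pos / (pos + neg)

    print(mood_score("What a wonderful, happy day"))       # 1.0
    print(mood_score("Feeling sad and a bit terrible"))    # 0.0

The crucial limitation is visible in the code itself: such a score tracks the words people type, not how they actually feel.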
The researchers claim to have shown that "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness." The effect was slight, but it was imposed on a very large population, so it may well have had real consequences for some people. The paper states its claims rather boldly, but one of the authors, Adam D. I. Kramer of Facebook, responding to intense criticism that it was wrong to study users without their permission, has since emphasized how tiny the effects were. However the results are interpreted now, though, they could not have been known in advance.
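The scale argument is simple arithmetic. In the sketch below, the subject count is the study's reported figure, but the per-user effect is a hypothetical placeholder; the point is only that even a one-in-a-thousand chance of a meaningful mood change, multiplied across the whole sample, touches hundreds of people.

    # Back-of-the-envelope sketch: tiny per-person effects at large scale.
    n_subjects = 689_003     # subject count reported in the PNAS paper
    per_user_shift = 0.001   # hypothetical: 1-in-1,000 chance of a meaningful mood change

    affected = n_subjects * per_user_shift
    print(f"Expected number meaningfully affected: {affected:.0f}")  # ~689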
The manipulation of emotion is no small thing. An estimated 60 percent of suicides are preceded by a mood disorder. Even mild depression has been shown to increase the risk of heart failure by 5 percent; moderate to severe depression increases it by 40 percent.
Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook's generic click-through agreement, which almost no one reads and which doesn't mention this kind of experimentation, was the only form of consent cited in the paper. To this day, the subjects have not been told that they were part of the experiment. Had the research been federally funded, such a complacent notion of informed consent would probably have been deemed a serious violation of federal rules on human-subjects research. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.
This is only one early publication about a whole new frontier in the manipulation of people, and Facebook shouldn't be singled out as a villain. All researchers, whether at universities or technology companies, need to focus more on the ethics of how they learn to improve their work.
To underscore the relevance of their study, the researchers noted that emotion matters to human health, and yet the study didn't measure any potential health effects of the controlled manipulation of emotions.