Facebook managed to rock its own boat over the weekend when it was revealed that the social networking site had run an experiment to see whether it could influence users’ emotions. The study was apparently motivated by a fear that people would stop visiting the site if they were exposed to too much negativity. As it turns out, seeing negative posts really does lead people to share negative thoughts of their own, and the same holds true for positive posts appearing in the news feed.
Some 689,000 users were subjected to the experiment, in which Facebook altered its news feed algorithm to reduce the number of either positive or negative status updates they saw. Naturally, a control group was also used to establish a baseline for what happens without any interference.
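To make the mechanics concrete, here is a minimal sketch of how a feed-filtering experiment along these lines could be structured: users are bucketed into treatment and control arms, and treatment users have a fraction of positive or negative posts withheld based on a crude word-list sentiment check. All of the names, word lists, and omission rates below are hypothetical illustrations, not Facebook's actual system (the published study reportedly used the LIWC lexicon to classify posts).

```python
import random

# Tiny stand-in word lists for sentiment classification; purely
# illustrative (the published study reportedly used the LIWC lexicon).
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by simple word matching."""
    words = {w.strip(".,!?") for w in post.lower().split()}
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def assign_condition(user_id: int) -> str:
    """Deterministically bucket a user into one of three experimental arms."""
    return ("reduce_positive", "reduce_negative", "control")[user_id % 3]

def build_feed(user_id: int, candidates: list[str],
               omission_rate: float = 0.5) -> list[str]:
    """Return a user's feed, withholding some posts per their condition.

    Control users see the unfiltered feed; treatment users have a
    fraction of positive (or negative) posts randomly omitted.
    """
    condition = assign_condition(user_id)
    if condition == "control":
        return candidates
    target = "positive" if condition == "reduce_positive" else "negative"
    rng = random.Random(user_id)  # seeded so each user's feed is stable
    return [p for p in candidates
            if classify(p) != target or rng.random() >= omission_rate]

posts = ["Feeling great today!", "This traffic is awful.", "Lunch was okay."]
print(build_feed(42, posts))  # user 42 falls in the reduce_positive arm
```

Even in this toy version, the detail the ethics debate turns on is visible: the bucketing and filtering happen entirely server-side, with nothing ever surfaced to the user.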
The revelation has raised questions about the ethics of the experiment itself, and whether Facebook had the right to manipulate its users in such a way. There is also the concern that Facebook never informed the users who took part in the study (even after it was over), which goes against international guidelines for this sort of research. Calls for future experiments to meet at least this minimum standard of care will probably push Facebook to rethink how it conducts any further studies on its users.
It should be noted, however, that the social media giant claims we all agreed to be part of such experiments under its terms of service. That may be technically true, but hardly anyone reads the agreement in full, and clicking through it probably shouldn’t qualify as informed consent.