Facebook was right to run its test

Lately, there has been an uproar over Facebook’s study on emotional contagion, which was run on a large number of users’ feeds. The study seemed to indicate that Facebook has the ability to make people happier or sadder – that is, to manipulate their emotions – by adjusting the content of their feeds.

Lots of people don’t like the idea of being unwitting lab rats in experiments, or the prospect of a corporation having the ability to control our thoughts and emotions, or simply the danger of conducting such a study.

To my mind, however, these arguments rest on a faulty assumption: that Facebook isn’t already manipulating our emotions, at least by accident. Indeed, the studies done on this topic suggest the opposite; Facebook already impacts its users’ emotions significantly. (one, two)

In addition, Facebook has to use some algorithm to decide what content to show its users. It’s a mistake to assume there is a “neutral” scenario, especially when all the evidence points to the contrary.

The issue then becomes a question of understanding what’s already happening, not of doing something qualitatively different, and it’s clear to me that in this frame Facebook should be conducting experiments like this (within limits, obviously).

The take by some, though, is that it’s not the experiment itself that’s troubling, but that it was conducted without users’ knowledge.

But again, if there really is no neutral scenario, and your emotions are going to be manipulated by any use of Facebook at all, then even before any intentional tests were run, you had already opted in to having your emotions manipulated. Indeed, you opted in when you started using Facebook, and every time you added something to your Facebook universe – a like, a new friend – you increased the scope of what could impact you.

So I’m glad Facebook is doing this research, and I’m really glad they’re publishing it for external consumption.