In terms of recent news that generated outrage, few stories in the past month can compete with the Facebook “Mood Manipulation” Experiment. If the story escaped your notice, here are the basics: A study conducted by Facebook’s data team filtered 689,003 users’ News Feeds for positive or negative keywords. The test was to see what impact this had on the users’ subsequent posts. Needless to say, users with only positive Feeds were more likely to say something positive. Negative Feeds led to more negative posts.
The study, titled “Experimental evidence of massive-scale emotional contagion through social networks,” can be read here.
As the story spread, most people’s immediate reactions were the same. “That’s illegal, right? You need to give consent to take part in a psychology experiment.” Well, yes you do. And in the most technical sense, every Facebook user already has. There’s a line in Facebook’s Terms of Service about agreeing to “data analysis, testing, [and] research.” As angry as the story made people, Facebook is covered from a legal standpoint.
Now that the dust from the outrage has mostly settled, it’s time to take a step back and survey the damage. At the end of the day, what did Facebook and the field of psychology get out of manipulating hundreds of thousands of people?
Not much of anything. The findings were statistically significant, but only just. I suppose Facebook gets to walk away with proof that it can mess with people’s moods, but I’m not really sure what can be done with that information.
No, believe it or not, as outraged as millions of Facebook users are, they’re the ones who have the most to gain from this story. Not because Facebook is going to do anything differently, but because it sparked debate. The best thing about this study isn’t the findings, it’s the conversation around it. That conversation comes down to three things:
1. Conversations about consent
It’s good that the worst to come out of the study was a few people ending up in bad moods, because unchecked exploitation of users tends to get worse before it gets better. Just like the Milgram experiment or the Stanford prison experiment, any time a psychological test pushes the limits of what we consider morally acceptable, it incites a conversation about what can and can’t be done in the name of science.
Psychological studies are often the toughest to define in terms of consent, since in most cases, a subject’s complete awareness of what is going on will taint the results. Subjects need to trust that the experimenters know what they’re doing and that they’re complying with all legal and ethical regulations.
In an age that’s so reliant on clicking the little box that says “I agree” before signing up, it’s good that we’re finally talking about how much a simple click really counts as “consent.”
2. Figuring out where we draw the line on data collection
People don’t think about it often, because when you think about it too much it starts to get scary, but the wheels of the business world are greased by your personal data. Advertisers and marketers are constantly collecting information to figure out how to send their messages in the most cost-effective way. It can be a bit unsettling, but it’s typically benignly anonymous. Businesses rarely care about your information on a personal level; they just need some feedback to see when, where, and how to spend their money. Most people know that and are fine with it.
Nobody’s upset that Facebook collected data for this study. They’re upset that the data was collected through manipulation. If this exact same study had been conducted by passively observing the positive/negative News Feed effect, there would be no story. Data collection often gets a bad rap, but this controversy indicates that it’s the threat of data manipulation that really bothers users.
3. The rare glimpse of Facebook’s humility
There is one more exceptional thing about this story. For one of the only times in recent memory, Facebook has acknowledged that maybe, just maybe, it did something unpopular. After catching flak for a few weeks, the company issued a statement saying, “It’s clear that people were upset by this study and we take responsibility for it. We want to do better in the future and are improving our process based on this feedback.”
It might be going a little too far to call this an apology, but hearing Zuckerberg & co. admit a mistake is so rare it’s almost cryptozoological.