Emotionally disconnected: The world is way scarier than Facebook

By Sam Court and Tim Smith | 15 July 2014

At some stage, most of us have experienced 'emotional contagion' - the tendency of humans to synchronise their emotions with those expressed by the people around them.

A recent study by Facebook and Cornell University, called Emotional Contagion, set out to prove that this concept also holds true in online social networks. And it does: the more ‘happy’ updates we see, the more likely we are to express happy thoughts ourselves – a significant step in understanding how the connectivity of the internet influences our feelings every day.

What Facebook apparently failed to foresee was the public’s response to discovering that the research had taken place. A quick search on Twitter exposes the outrage: “super disturbing”, “creepy” and “evil”, to quote just a few.

As a result, government organisations are preparing their responses, and the Electronic Privacy Information Center (EPIC – what an acronym) has already filed a complaint with the US Federal Trade Commission. According to the complaint, Facebook “purposefully messed with people’s minds” in a “secretive and non-consensual” study – a horrific example, apparently, of how society, and Facebook in particular, has gone badly wrong.

But here’s a reality check, folks: this study didn’t explore anything fundamentally different from a normal day on the internet, nor did it lie to participants in order to manipulate them.

They’re all at it

Ever since people started building websites, they’ve been tracking your activities. They might not know your name or telephone number (though many do), but at the very least, they know where you are, what device you’re using, how often you visit and which pages you look at. And they use that information to manipulate what you see and measure how you react.

Everyone’s doing it. They’re all watching and testing you. All of them.

But whether we realise it or not, we love the internet for making this possible. It’s thanks to this constant cycle of ‘test and learn’ that we get smarter, more targeted and more personalised content.
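As a rough illustration of that ‘test and learn’ cycle, here is a minimal Python sketch of how a site might deterministically sort visitors into experiment buckets. The function name, experiment label and bucket names are hypothetical examples of ours, not any real site’s implementation.

import hashlib

def assign_bucket(visitor_id, experiment, buckets=("control", "variant")):
    """Deterministically assign a visitor to an experiment bucket.

    Hashing the visitor ID together with the experiment name means the
    same visitor always sees the same variant, with no stored state.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# The same anonymous visitor lands in the same bucket on every visit.
print(assign_bucket("cookie-1a2b3c", "new-homepage-layout"))

Run across millions of visitors, a few lines like these are all it takes to quietly test how you react.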

Google Now and the exciting new G Watch epitomise this with their pre-emptive Android Wear notifications. The watch tells the wearer when to leave to catch their plane, because it has read their flight confirmation email, found their location using the phone’s GPS, and combined all of that personal data with publicly available travel information. Awesome! And by building on studies like Emotional Contagion, our devices will know more about how we’re feeling and will be able to tailor the internet to each of our individual needs.

What defines ‘news’ anyway?

Here’s a newsflash: an algorithm has always driven your Facebook newsfeed. On some days you see more sad posts than happy ones. But where are the complaints about that?

In the study, Facebook didn’t show anything fake; they just juggled things around a bit. Your mate Mark really did share a picture of a sick puppy, and your sister did post a grumpy comment about commuters. And sorry, you weren’t shown that picture of Dave’s steak dinner that 21 people ‘liked’. But all those activities in your network actually did happen.
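To make that concrete, here is a toy Python sketch, built on our own assumptions rather than Facebook’s actual code, of a feed that simply re-ranks real posts by a sentiment score. Every item shown still genuinely happened; some are just pushed down or left out.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    sentiment: float  # hypothetical score: -1.0 (negative) to 1.0 (positive)

def rank_feed(posts, mood_bias=0.0):
    """Reorder real posts by sentiment; nothing is invented or altered.

    A positive mood_bias pushes happier posts up the feed; a negative
    one pushes them down. Every item returned was genuinely shared.
    """
    return sorted(posts, key=lambda p: mood_bias * p.sentiment, reverse=True)

feed = [
    Post("Mark", "Our puppy is poorly :(", -0.8),
    Post("Sis", "Commuters, honestly...", -0.4),
    Post("Dave", "Best steak dinner ever!", 0.9),
]

# Bias the feed towards happy posts: Dave's steak rises, the sick puppy sinks.
for post in rank_feed(feed, mood_bias=1.0):
    print(post.author, "-", post.text)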

And how does this differ from the wider media? Is it unethical for traditional news channels to publish tragic (but newsworthy) articles about global atrocities, just because that information might make readers feel sad?

The truth is that everything influences us, from what we had for breakfast to the weather at the weekend. And this study was no different.

But is Facebook actually evil?

We’re not suggesting that users have no right to be unnerved – these are scarily unregulated times for data collection. But it does seem naïve to be angry with Facebook specifically, as this study didn’t do anything that isn’t regularly being done by governments, corporations and other social media channels.

So why have so many responded so negatively? Well, simply because it’s Facebook. Even with all of Zuckerberg’s social nous, to many, Facebook has become just another untrustworthy corporation – and one that has become difficult to avoid.

Despite this apparent distrust, Facebook isn’t pure evil. In many ways they are constantly striving towards smarter ways for us to digest information from our family, friends and content providers. And, like Google, they know that the better the experience they provide, the more engaged their users remain.

In an attempt to cool the heated commentary about the study, Facebook’s COO Sheryl Sandberg apologised for the way the study was carried out, stating: “It was poorly communicated. And for that communication we apologise.”

Facebook may well have the very best of intentions, but every time it hides an obscure clause in its terms and conditions, or it changes its newsfeed algorithm, or it dares to redesign its service, users just get more and more annoyed.

The social media giant has a lot to learn in terms of PR. They were clearly proud of their findings, and had every intention of releasing the outcomes of this research to the public. But they’re obviously out of touch with how their users feel about the use of their 'private' data.

Creating contagion through studying contagion

So how can Facebook avoid these kinds of media disasters in the future? The obvious option is to learn from the emotional contagion phenomenon itself. Rather than creating negative emotional reactions around another privacy scandal, perhaps they could get on the front foot and give people control over their data.

Brands like Facebook and Amazon already engage users to test design releases before they become publicly available. So why not give users the chance to specifically opt in to studies like this, and then reward them afterwards when they do?

Ultimately, we all benefit from the intense debate that surrounds this study. But instead of directing such venom at Facebook alone, we should shift our attention to improving the ethical oversight of research and data privacy practices. The lines separating research from everyday reality are now almost invisible, so we need to address these challenges holistically.

Organisations must become accountable for their manipulations, regardless of whether they’re labelled ‘research’ or ‘commerce’.

Sam Court and Tim Smith
UX director, and senior social strategist and head of community management
The White Agency
