Facebook Is, Indeed, Toying With Your Emotions

Posted by Stephanie Mau

on July 4, 2014

If Only There Was a ‘Dislike’ Button

Do you ever have one of those days where scrolling through the successes and triumphs of your Facebook friends just gets you down?

Turns out, Facebook knows you have those days. In fact, out of scientific curiosity, Facebook manufactured some of those days for their users back in 2012.

Aware of its users’ tendency to be emotionally influenced by the material in their feeds, Facebook has revealed that it conducted an A/B experiment on a subset of its users two years ago. In this experiment, one subgroup of Facebook users was shown a more positive content mix on their news feeds, while another subgroup was shown a more negative one. Almost 700,000 users were involved in this little test, which lasted a week.

Was This Ethical?

Well, that depends on whom you ask.

The users involved were not told that they would be taking part in such an experiment, and in order for social experiments to be considered ethical, informed consent is usually a requirement.

That matters all the more because, as the results published in this month’s issue of Proceedings of the National Academy of Sciences show, the study seemed to work. The users who faced more positive news feeds were more likely to produce upbeat posts, while the users who had more negative news feeds were slightly more likely to produce negative posts.

However, according to Facebook’s terms of service, this type of ‘research’ was perfectly allowed. Their data use policy states that: “We may use the information we receive about you for internal operations, including trouble-shooting, data analysis, testing, research and service improvement.”

So, supposedly, this type of experiment is just the price you pay when you sign up for Facebook. But the company still deliberately made a significant number of its users experience negative emotions, and that is pretty obviously unethical.

What If It Happened in the Workplace?

While Facebook’s experiment in toying with their users’ emotions may be in something of a grey area, intentionally invoking negative emotions in others is unquestionably not okay in the workplace.

As we reported a month ago, bullying is a prevalent problem in the American workplace, and can often result in employees feeling anxiety, depression, and even post-traumatic stress.

Interestingly, another study that Facebook is conducting speaks to the problem of cyberbullying on their website. In dealing with user posts that may insult or embarrass other Facebook users, Facebook is encouraging users to speak up for themselves. To foster this, the social networking website is now giving their users options to express how they feel about negative posts, with such templates as “it’s embarrassing” or “it shows inappropriate behaviour.”

In the workplace, however, it may not be as easy to speak up to your colleagues and superiors about bullying and ostracism.

Having a company like WhistleBlower Security on your side, where employees can report any type of behaviour, including ostracism, goes a long way toward ensuring that all of your employees experience the same level of job satisfaction. Our tools enable the reporting of any type of wrongdoing, and your employees can feel secure that whatever they need to report, they can do so with complete confidentiality and anonymity.

WhistleBlower Security is committed to promoting a culture of integrity, collaboration and transparency for all our employees and clients. With a 24/7/365 whistleblower hotline, employees can be assured that all of their ethical concerns will be heard and addressed.


Sources:

http://money.cnn.com/2014/06/30/technology/facebook-mood-experiment/index.html?iid=EL

http://money.cnn.com/2014/07/01/technology/social/facebook-compassion-research/index.html?iid=EL

http://money.cnn.com/2014/07/02/technology/facebook-experiment-probe/index.html?iid=SF_T_River