© 2024 Blaze Media LLC. All rights reserved.
Creepy Study Shows Facebook Can Tweak Your Moods Through 'Emotional Contagion'
In this Wednesday, Feb. 12, 2014, Facebook software engineer Brielle Harrison demonstrates expanded options for gender identification at her company's Menlo Park, Calif., headquarters. Harrison, who helped engineer the project, plans to switch her identifier to "Trans Woman." (AP Photo/Noah Berger)

"Facebook apparently manipulates people's News Feeds all the time.”

The good news: a study found evidence that good news seems to make you feel better.

The creepy news: if you have a Facebook account, you could have been an unwitting research subject in that study — or you could be one now.

The study, "Experimental evidence of massive-scale emotional contagion through social networks," included nearly 700,000 Facebook users as subjects and the results were published in the journal Proceedings of the National Academy of Sciences of the United States of America.

By analyzing three million Facebook posts over the course of a week in January 2012, a trio of researchers found that "emotional contagion" seems to be a real, measurable phenomenon; that is, when people are exposed to negative or positive emotions from others, they tend to reflect those emotions as well.

The study was pulled off through manipulation of News Feeds on Facebook.

A member of the media takes pictures of Facebook CEO Mark Zuckerberg as he speaks at an event at Facebook's headquarters office in Menlo Park, California, on January 15, 2012. (AFP/Getty Images)

Facebook uses an algorithm to determine what pops up on users' News Feeds, since most users couldn't keep up with every single piece of content posted by friends and the pages they follow. For the study, the Facebook-blessed researchers tweaked that algorithm so some users saw more positive emotional content in their News Feeds, while others saw more negative content.

The researchers wanted to see if viewing certain emotional expressions would lead users to, in turn, post statuses expressing those same emotions — and that's exactly what they found.

"...for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive," the researchers wrote. "When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods."

The researchers also found that people who saw less emotional content overall were less emotionally expressive in their status updates.

But how did the researchers get the consent of nearly 700,000 participants?

Those people had already agreed to participate, just by signing up for Facebook.

The experiment "was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research," the researchers noted.

The researchers might be technically correct, but many have questioned the ethics of the study, considering that many Facebook users likely have no idea they could be subjected to experiments at random.

“I was concerned,” Princeton psychology professor Susan Fiske told the Atlantic, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time.”

So Facebook can tweak your moods just by tweaking what you see in your News Feed — a victory for scientific understanding with some really creepy ramifications.

Of course, as a recent viral video demonstrated, it's possible your friends are presenting a false front online in the first place, which could make Facebook's manipulation a little less scary and more absurd.

Follow Zach Noble (@thezachnoble) on Twitter
