For the past couple of days, the tech press has been in an uproar over the news, first published by the A.V. Club, that Facebook tinkered with users’ feeds for a massive psychology experiment in 2012. The money quote from the article is below:

It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.

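To make the mechanics concrete, here is a minimal sketch of how that kind of experiment could be wired into feed generation. The word lists, omission probability and function names are illustrative assumptions on my part (the published study used the LIWC word lists to classify posts), not Facebook’s code:

```python
import random

# Illustrative word lists; the actual study used LIWC (assumption for this sketch).
POSITIVE = {"love", "great", "happy", "awesome", "win"}
NEGATIVE = {"sad", "hate", "awful", "lose", "angry"}

def sentiment(post):
    """Crude sentiment score: +1 per positive word, -1 per negative word."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def build_feed(candidate_posts, condition, omit_prob=0.3, rng=random):
    """Skew a user's feed by probabilistically omitting emotional posts.

    condition: 'reduced_positive', 'reduced_negative', or 'control'.
    """
    feed = []
    for post in candidate_posts:
        s = sentiment(post)
        if condition == "reduced_positive" and s > 0 and rng.random() < omit_prob:
            continue  # silently drop some positive posts for this user
        if condition == "reduced_negative" and s < 0 and rng.random() < omit_prob:
            continue  # silently drop some negative posts for this user
        feed.append(post)
    return feed

# Randomly assign users to conditions, then compare the average sentiment
# of what each group goes on to post over the following week.
users = ["alice", "bob", "carol", "dave"]
conditions = {u: random.choice(["reduced_positive", "reduced_negative", "control"])
              for u in users}
```
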
The strange thing about the recent uproar is that the focus of the anger seems to be that Facebook ran the experiment at all. That anger is strange if you stop and think about two things we already know as humans:

1. People are influenced by what they see, including what they see on social networks like Facebook. Remember all those "Facebook makes you sadder" headlines from a year or two ago? How about the fact that just yesterday the MayDay PAC raised $5 million from almost 50,000 people, thanks to viral sharing on social media by people like George Takei? Those are tens of thousands of people influenced to spend money to change how their government works based on what they saw in their news feeds.

2. Facebook controls what you see in your news feed.

The second point can’t be emphasized enough. Remember when Facebook explicitly spelled out how EdgeRank works?

Over the past few years, Facebook has made hundreds of tweaks to the news feed; some we notice and others we don’t. The image above came from an article explaining one such tweak, which caused posts by brands to show up far less often in the news feed. Those tweaks have flooded our feeds with wave after wave of content that later all but disappeared: quizzes and polls, Zynga games like Mafia Wars and Farmville, articles our friends were reading via social readers, videos from social video apps like Viddy, Bitstrips comics and, of course, Upworthy articles, to name a few.
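
For those who have forgotten: EdgeRank, as Facebook presented it publicly around 2010, scored each story as the sum over its edges (likes, comments, etc.) of affinity × edge weight × time decay. Here is a minimal sketch of that scoring, where the specific weights and the exponential decay curve are my own illustrative assumptions rather than Facebook’s actual tuning:

```python
import time

# Illustrative per-edge-type weights; Facebook's real values were never public.
EDGE_WEIGHTS = {"comment": 4.0, "like": 1.0, "share": 6.0, "tag": 3.0}

def edgerank(edges, now=None, half_life_hours=24.0):
    """Score a story as sum(affinity * edge_weight * time_decay) over its edges.

    edges: list of (affinity, edge_type, created_at_epoch_seconds) tuples,
    where affinity reflects how close the viewer is to the edge's creator.
    """
    now = now or time.time()
    score = 0.0
    for affinity, edge_type, created_at in edges:
        age_hours = (now - created_at) / 3600.0
        decay = 0.5 ** (age_hours / half_life_hours)  # assumed exponential decay
        score += affinity * EDGE_WEIGHTS.get(edge_type, 1.0) * decay
    return score

# Rank a feed: highest-scoring stories first.
stories = {"story_a": [(0.9, "comment", time.time() - 3600)],
           "story_b": [(0.2, "like", time.time() - 86400)]}
feed = sorted(stories, key=lambda s: edgerank(stories[s]), reverse=True)
```

Nudge the "share" weight up or shorten the decay half-life and an entirely different feed comes out the other end, which is exactly the point.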

For each of these waves of content dominating our news feeds, some product manager decided to turn the dial for that content up or down based on our “engagement” with Facebook. No outside party vets these changes, nor is there any way for an interested party to even tell what the changes are. It is quite unprecedented in the history of the world for any entity (company or government) to control so much of the media seen daily by millions of people without any visibility into its agenda or into the content it feeds its subjects.
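
That dial-turning doesn’t need to be anything exotic. Here is a hypothetical sketch of what it might look like, with made-up content types, metrics and thresholds of my own; nothing here is Facebook’s actual system:

```python
# Hypothetical per-content-type dials layered on top of a base ranking score.
content_dials = {"brand_page": 1.0, "viral_video": 1.0, "friend_photo": 1.0}

def retune_dials(engagement_by_type, dials, step=0.1, floor=0.1, ceiling=3.0):
    """Nudge each dial toward whatever users engaged with last week.

    engagement_by_type: content type -> change in engagement vs. the prior
    week, e.g. {"brand_page": -0.15} means 15% less time spent on brand posts.
    """
    for ctype, delta in engagement_by_type.items():
        if delta > 0:
            dials[ctype] = min(ceiling, dials[ctype] + step)  # show more of it
        else:
            dials[ctype] = max(floor, dials[ctype] - step)    # show less of it
    return dials

# One product manager, one deploy, and brand posts quietly fade from the feed.
retune_dials({"brand_page": -0.15, "viral_video": 0.30}, content_dials)
```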

Most of the people still bloviating on this topic on Techmeme are upset that Facebook “manipulated people’s emotions without any oversight for an experiment.” The reality is that Facebook manipulates people’s emotions by tinkering with the news feed to increase engagement (i.e., time spent on the site looking at ads) every minute of every hour of every day.

That’s why Sheryl Sandberg shrugged as she responded that the major problem with the experiment was that it was poorly communicated. She’s right. Facebook does this every day; manipulating your behavior by manipulating your news feed is its primary business. If anything, this experiment should be commended, because it implies Facebook had, at least at one point, considered the impact of this manipulation on the psychological health of its users and wanted to understand it better.

Speaking of the lack of oversight and transparency, one can’t help but wonder what subtle dampeners or viral boosts Facebook applies to content depending on the politics of the situation. For example, it’s interesting that George Takei’s posts still garner hundreds of thousands of likes each time they show up while other Facebook pages are seeing double-digit percentage declines. With other media, like Fox News or the Wall Street Journal, the agenda is clear and understood by all. Facebook’s editing of which content from your friends and brands you see, on the other hand, is driven by an unknown agenda while masquerading as serendipitous, organic content.

Maybe Facebook doesn’t manipulate your feed based on politics. Maybe it did at one time and then stopped. Maybe it will in the future. We don’t know, and if it ever happens, we won’t even realize it.

So go ahead and freak out about one A/B test in 2012. That totally seems like the most worrisome thing about Facebook’s power over its users.

♪ Now Playing: Rick Ross - Nobody (featuring French Montana) ♪

Categories: Social Software