
Facebook Study Doesn't Silence Filter-Bubble Criticism

Researchers for Facebook published a study suggesting personal preferences have more effect on what viewers see than the network's algorithms.

By Jasmine Bailey | May 8, 2015

Do Facebook's algorithms affect what we see on our timelines? 

Some have questioned whether the site creates a so-called "filter bubble" — the idea that Facebook's algorithms guess what you want to see on your timeline, letting a lot of important, more newsworthy information fall by the wayside.

So to tackle the debate, researchers from Facebook published a study in the journal Science. It focuses primarily on ideological diversity in political news. (Video via Facebook)


But according to the findings, Facebook's algorithms aren't to blame. The authors write, "We conclusively establish that on average in the context of Facebook, individual choices more than algorithms limit exposure to attitude-challenging content."

But critics are skeptical, pointing to holes in the findings — for one, the number of user accounts researchers examined. A very small percentage of Facebook's 1.2 billion users were studied, and the authors included only accounts whose users specifically listed their political affiliation.

And others say personal preferences and Facebook algorithms go hand in hand.

Sociologist Nathan Jurgenson is among those questioning the findings. (Video via Netnografia)

In The Society Pages, he explained: "Individual users choosing news they agree with and Facebook's algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend."

In other words, the algorithm is built from user preferences in the first place, so the two influences can't be cleanly separated.

Research blog Social Media Collective says the study is a way for Facebook to deflect blame, dubbing it "The Facebook 'It's Not Our Fault' Study."

And even if other researchers wanted to review the findings, Fortune points out the study can't be reproduced because only Facebook researchers have access to the user data it examined.

So according to critics, the study did nothing to help Facebook prove a point.

This video includes images from Getty Images. 
