In A Bubble? Change Your Friends!

Political scientists and social theorists have long fretted about the Internet’s potential to flatten and polarize democratic discourse. Exposure to news and opinion increasingly occurs through social media. How social networks choose to filter and personalize our news feeds influences our exposure to perspectives that cut across ideological lines. Can we trust the algorithms they use? Or are they hiding relevant information from us?

In “The Filter Bubble”, Eli Pariser argued that news-filtering algorithms could significantly narrow what we know, surrounding us with information that tends to support what we already believe. On May 7, 2015, Science published a study (1) by Facebook employees which puts part of the filter bubble theory to the test by examining what content you do (and don’t) see on Facebook’s news feed:

Our latest research, released today in Science, quantifies, for the first time, exactly how much individuals could be and are exposed to ideologically diverse news and information in social media. (“Exposure to Diverse Information on Facebook”)

How did the “filter bubble” theory hold up? The authors’ takeaway is that Facebook’s News Feed algorithm does not prevent users from seeing opinions they disagree with:

The composition of our social networks is the most important factor affecting the mix of content encountered on social media with individual choice also playing a large role. News Feed ranking has a smaller impact on the diversity of information we see from the other side.

In other words: the filter bubble is your own damn fault. If you’re not seeing content on Facebook that challenges your personal views, it’s because of your own choices, not the algorithm’s.

Backlash to the study came overnight. Scientists criticised the study for being conducted on a small, skewed subset of Facebook users who chose to self-identify their political affiliation, and accused the authors of trying to minimize the impact of Facebook’s algorithms:

The most important finding, if you ask me, is buried in an appendix. Here’s the chart showing that the higher an item is in the newsfeed, the more likely it is clicked on (…) You live and die by placement, determined by the newsfeed algorithm. (“How Facebook’s Algorithm Suppresses Content Diversity (Modestly) and How the Newsfeed Rules Your Clicks”)
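
To make the placement point concrete, here is a minimal sketch of how one might compute click rate by News Feed position from exposure logs. This is purely illustrative and is not the authors’ code; the field names and the sample records are hypothetical.

```python
# Illustrative sketch only: the study's appendix reports click likelihood by
# News Feed position. The record format and sample data below are made up.
from collections import defaultdict

def click_rate_by_position(exposures):
    """Return the fraction of shown stories that were clicked, per feed position."""
    shown = defaultdict(int)
    clicked = defaultdict(int)
    for e in exposures:
        pos = e["position"]           # 1 = top of the News Feed
        shown[pos] += 1
        clicked[pos] += e["clicked"]  # 1 if the story was clicked, else 0
    return {pos: clicked[pos] / shown[pos] for pos in sorted(shown)}

# Hypothetical log records, not Facebook data.
sample = [
    {"position": 1, "clicked": 1},
    {"position": 1, "clicked": 0},
    {"position": 10, "clicked": 0},
    {"position": 10, "clicked": 0},
]
print(click_rate_by_position(sample))  # e.g. {1: 0.5, 10: 0.0}
```

The point of the critique is that whatever determines `position` (the ranking algorithm) largely determines what gets clicked, so the algorithm’s influence cannot be waved away.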

Interestingly, Pariser remarks:

… this is good science, but because it’s by Facebook scientists, it’s not reproducible.

This isn’t the first time Facebook research has angered the academic community.

There is something fishy here.

___________________

(1) Bakshy, Eytan, Solomon Messing, and Lada Adamic. “Exposure to Ideologically Diverse News and Opinion on Facebook.” Science, May 7, 2015, aaa1160. doi:10.1126/science.aaa1160.
