Problem
Most of your social networks are echo chambers these days.
Solution
According to Facebook, via a study it published in Science, it isn't the algorithm that only shows you stuff you agree with; you choose your friends that way.
Method
First, Facebook anonymized data for ten million users who identified themselves as liberal or conservative, along with seven million URLs for news stories shared between July 7, 2014, and January 7, 2015. Then it used software to identify URLs for hard news stories shared by at least twenty users with a listed political affiliation, and labelled each story as liberal, conservative, or neutral, depending on who shared it.
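For the curious, that labelling step might look something like the sketch below. This is purely illustrative, not the study's actual code: the share records, the label_stories function, and the neutrality cutoff are all assumptions made for the sake of the example; the only detail taken from the study is the twenty-sharer threshold.

```python
from collections import Counter

# Hypothetical illustration only: 'shares' is an imagined list of
# (url, affiliation) records, and the 0.1 neutrality cutoff is made up.
MIN_SHARERS = 20  # the study's threshold of twenty affiliated sharers

def label_stories(shares):
    """Label each URL liberal, conservative, or neutral,
    depending on who shared it."""
    tallies = {}
    for url, affiliation in shares:
        tallies.setdefault(url, Counter())[affiliation] += 1

    labels = {}
    for url, tally in tallies.items():
        total = tally["liberal"] + tally["conservative"]
        if total < MIN_SHARERS:
            continue  # too few affiliated sharers to classify
        # lean ranges from -1 (all liberal sharers) to +1 (all conservative)
        lean = (tally["conservative"] - tally["liberal"]) / total
        if lean > 0.1:
            labels[url] = "conservative"
        elif lean < -0.1:
            labels[url] = "liberal"
        else:
            labels[url] = "neutral"
    return labels
```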
Results
Only twenty-four per cent of the stories shared by liberal users aligned with conservative viewpoints, and only thirty-four per cent of the stories shared by conservative users aligned with liberal viewpoints, which means (averaging the two figures) these users were exposed to ideas from the other side of the spectrum only about thirty per cent of the time.
On top of that, Facebook found its algorithm reduces users' exposure to content from friends on the opposite side of the political spectrum by less than one percentage point.
Further, when it comes down to actual clicks, conservatives were seventeen per cent less likely to click on liberal content and liberals were six per cent less likely to click on conservative articles.
The Takeaway
First of all, take these findings with a grain of salt; after all, Facebook isn’t an unbiased source here.
That said, don’t get your news from Facebook. Pick five or six diverse sources, visit them at the top of the day, and let Facebook be a source of cat pictures, funny videos, and world-class men’s lifestyle content. Ahem.
Something else worth considering: read a paper or site from the other side of the political aisle. Not a tabloid—the object isn’t to find a hate-read here. Pick a good one. Doesn’t hurt to know how the other side sees things. Besides, you might learn something.