A few days ago, a paper published by members of Facebook’s Data Science Team revealed something that many users of the network had probably already felt – that the social network’s algorithms, by altering what content users see, can insulate them from opposing views. In this case, researchers were studying the right/left divide in American politics and, unsurprisingly, found that people identifying with one camp saw less content from the other.
However, the differences were not as drastic as one might think, and the results come with a caveat: while the algorithms may filter out some politically unsavoury content, it’s still the user who’s primarily responsible for staying inside the bubble.
How do we get better stories to bubble up in front of us, and avoid insulating ourselves from the outside world?
- "Exposure To Ideologically Diverse News And Opinion On Facebook" – Science
- "Facebook Says Its Algorithms Aren't Responsible For Your Online Echo Chamber" – The Verge
- "Don't (Just) Blame Facebook: We Build Our Own Bubbles" – ArsTechnica
- "Did Facebook's Big New Study Kill My Filter Bubble Thesis?" – Medium
- "The May Challenge" – Medium
- "Wired Wrists: News In The Age Of Wearables" – Medium