053: Bursting The Bubble

A few days ago, a paper published by members of Facebook’s Data Science Team revealed something that many users of the network had probably already sensed: the social network’s algorithms, by altering what content users see, can insulate them from opposing views. In this case, the researchers studied the right/left divide in American politics and found that, unsurprisingly, people identifying with one camp saw less content from the other.

However, the differences were not as drastic as one might expect. And the results come with a caveat: while the algorithms may filter out some politically unsavoury content, it’s still the user who is primarily responsible for staying inside the bubble.

So how do we get better stories to bubble up in front of us, and avoid insulating ourselves from the outside world?