While Trump was busy making the filter bubble a mainstream problem, we were developing https://facebook.tracking.exposed. We can't solve a political problem with technology, but we can at least provide reliable data to help address the phenomenon objectively. The goal of the project is to collect the stories that make up your Facebook Newsfeed, and to aggregate and analyse that data in order to study the relation between filter bubble effects and how the personalization algorithm (PA) operates. Everything we do is open source, so you can see how it works and help make it better.

Filter bubbles and the proliferation of fake news are side-effects of the PA. As 2016 showed us, the PA can have a real impact on our society. We treat algorithms as nothing less than social policies. But social policies have to be debated openly, not settled in a technical meeting intended to optimize revenue. That is why we promote a collaborative study of the PA.
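To make the collection step concrete, here is a minimal sketch of what a Newsfeed-collecting content script could look like. The selectors, field names, and endpoint URL below are illustrative assumptions for this sketch, not the project's actual code:

```typescript
// Illustrative sketch of a Newsfeed-collecting content script.
// Selectors, field names, and the endpoint URL are assumptions for
// demonstration only; see the open source repository for the real code.

interface CollectedPost {
  observedAt: string;   // when the post was seen in the feed
  position: number;     // position in the Newsfeed at observation time
  html: string;         // raw markup, to be parsed server-side
}

const COLLECTOR_ENDPOINT = 'https://example.org/api/v1/events'; // hypothetical

function scrapeNewsfeed(): CollectedPost[] {
  // '[role="article"]' is a simplified selector for feed stories.
  const nodes = Array.from(document.querySelectorAll('[role="article"]'));
  return nodes.map((node, index) => ({
    observedAt: new Date().toISOString(),
    position: index,
    html: node.outerHTML,
  }));
}

async function submit(posts: CollectedPost[]): Promise<void> {
  if (posts.length === 0) return;
  await fetch(COLLECTOR_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ posts }),
  });
}

// Poll the page every 30 seconds while the user browses.
setInterval(() => {
  submit(scrapeNewsfeed()).catch(console.error);
}, 30_000);
```

Only the publicly visible posts observed in the supporter's own feed are of interest here; the point is to record what the algorithm chose to show, and in what order, so that feeds can later be compared across people.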

Recently we have heard a lot about how Facebook and Google might be "regulated", but civil society lacks a reliable tool to judge what PAs are actually doing.

Facebook.tracking.exposed wants to remedy this. Our vision is to increase transparency around personalization algorithms, so that people can have more effective control over their Facebook experience and greater awareness of the information they are exposed to.

Our mission is to help researchers assess how current filtering mechanisms work and how personalization algorithms should be modified, in order to minimize the dangerous social effects for which they are indirectly responsible and to maximize the values, both individual and social, that algorithms should incorporate.

In this session, I will present the browser extension, show how supporters can get useful insights from it, and explain how algorithm auditors can use it to run their analyses.
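As a hedged sketch of the kind of analysis an auditor might run on the collected data (the data shape and field names below are assumptions, not the project's API), one simple measure is the overlap between the sets of posts two supporters were shown, a rough proxy for how differently the algorithm treats them:

```typescript
// Illustrative analysis sketch: Jaccard overlap between the post identifiers
// two supporters were shown. The TimelineEvent shape is an assumption.

interface TimelineEvent {
  supporterId: string;
  postId: string;
}

function feedOverlap(events: TimelineEvent[], a: string, b: string): number {
  const seenBy = (id: string) =>
    new Set(events.filter(e => e.supporterId === id).map(e => e.postId));
  const setA = seenBy(a);
  const setB = seenBy(b);
  const intersection = [...setA].filter(p => setB.has(p)).length;
  const union = new Set([...setA, ...setB]).size;
  return union === 0 ? 0 : intersection / union; // 1 = identical feeds, 0 = disjoint
}

// Example: two supporters who follow the same pages but see different posts.
const events: TimelineEvent[] = [
  { supporterId: 'alice', postId: 'p1' },
  { supporterId: 'alice', postId: 'p2' },
  { supporterId: 'bob',   postId: 'p2' },
  { supporterId: 'bob',   postId: 'p3' },
];
console.log(feedOverlap(events, 'alice', 'bob')); // 0.33… → substantial personalization
```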
