Stage 1
16:15 - 16:45
Live Translation
We Can Work It Out
An approach to adversarial research (Video-in)

Short thesis

The use of data-driven algorithmic systems to run our lives has become commonplace. It is also becoming increasingly clear that they don’t work equally for everyone. Interrogating these systems is challenging because they are usually protected by terms and conditions that keep their code opaque and their data inaccessible to outsiders. So how do you fight injustice if you can’t see it? One approach is to find the stories of who these systems harm rather than focusing on how they work.
In today’s digital world, social, economic, and racial injustice lurks in the shadows of the unseen Facebook post, the hidden algorithm used to sort employment resumes, and the risk assessment tool used in criminal sentencing. These systems tend to be opaque and beyond scrutiny: access is usually restricted to large companies and governing bodies whose interests are often misaligned with those of their customers and citizens. Much of the criticism of the technology industry remains hypothetical or speculative because it is very difficult to measure how people are being harmed. The personalization that has transformed how we use the internet has also obscured the disparate impact that takes place there, making it significantly harder for those interested in regulation to collect the evidence needed to hold tech companies accountable. Some of this evidence can be gathered by harnessing the network and communications infrastructure that the internet is built on. The data traveling through these systems tell compelling stories if you know how to look for them, and they often reflect the systemic biases and prejudices prevalent in society.