The Computer is Your Redacted
Overview
The Computer is Your Redacted is a terrifying and ridiculous audio play where you can listen as four experts in artificial intelligence, technology, and culture play the classic role-playing game Paranoia! and navigate a horrible and occasionally hilarious dystopia. What happens when a business leader, an artist, an ethicist, and an academic come together to explore artificial intelligence through a role-playing game that embodies our collective anxieties about an automated future?
As part of the Goethe-Institut Toronto’s Algorithmic Culture programming, we created a rather unusual podcast. We brought together four brilliant individuals connected to AI and its ethics and policy implications to take part in an audio series/podcast. The format was not a traditional one, however. Rather, we invited them to play. Specifically, to play a tabletop role-playing game, a slightly modified version of the 1980s classic Paranoia!. We were seeking a fun and playful way to surface themes around AI while bypassing the polarized positions so often present in these discussions.
What resulted was absurd and revealing.
Role-playing games can be understood as both play and performance. A podcast was produced to document this virtual ‘campaign’ among invited experts over the course of a very long Saturday in May 2021. We wanted to see how role-playing might be a model for anti-algorithmic, improvisational, instantiated, and empathetic responses to the actual and potential hegemony of algorithmic culture.
Algorithmic culture refers to the process whereby the logic of big data and computation comes to change how culture is perceived and experienced. In May 2020, Glen Weyl and Jaron Lanier argued in Wired that artificial intelligence (AI) is “best understood as a political and social ideology rather than as a basket of algorithms”. This ideology holds that a small technical elite can and should develop technologies that will replace human agency and judgment. It understands individuals as objects requiring optimization and control. And John Cheney-Lippold, author of We Are Data, offers that “when our embodied individualities get ignored, we increasingly lose control not just over life but over how life itself is defined”.
Many are concerned that machines will exceed human intelligence. We should also be troubled by the idea that humans are being asked to behave more like machines in service to the liberating ambitions of scale and efficiency. Rather than creating an algorithm that thinks like a human, the goal, it increasingly appears, is to automate us. Notions of autonomy, identity, and community are being reworked under the influence of artificial intelligence.
However, we should avoid blaming the nature of algorithms for the current situation. Technology is inevitably an expression of the objectives of the broader culture. The work we assign our algorithms to do reflects the things we value: the questions we ask them to solve, the data we feed them, and how we choose to use them.
The objectification of human subjects will have profound and material impacts outside of the digital. “The Computer is Your Redacted” explored these impacts on the body, on landscapes, and through different experiences of diaspora. The possibility of defying a unitary logic of utility rests in reconnecting to lives, landscapes, and cultures as subjects in dialogue with each other and the world.
Ethics provides us with generalized rules and principles. Theories – scientific or ethical – are abstracted, and The Computer is Your Redacted was an invitation to reconnect concepts to the ontological roots of being.
AI is driven by a spoken or unspoken need to understand people and the world as ‘finalized’ objects of an automated process. By placing subjects into these relational structures of interpretation, we become answerable for what happens next. We undersign ourselves to the decisions, actions, and speech conducted in our name.
The current approach to imagining and representing AI prioritizes the scaffolding of previous interpretations. A speaker tells us that an automated vehicle failed to recognize Black faces in its operation. We already hold that this injustice is unacceptable, and we argue for greater vigilance in the creation and regulation of these automated systems. At no point are we provided a direct encounter with a material event. Rather than embodying or transforming our perspective, we are asked to apply our existing perspectives to the issue at hand. In many cases this is sufficient. However, the changes underway too often escape the boundaries of our existing sets of experiences. Perhaps the current listlessness in engaging with the opportunities and threats of AI is less a product of indifference or hypocrisy and more a reflection of the inaccessibility of the ‘event’ under discussion.
By centering a dialogic approach to materializing our relationships with AI, diverse perspectives can be brought into relationship with one another to build a dynamic and polyphonic structure of representation. This suggests a preference for discursive (or ‘novelized’) forms of artistic response.
We saw this project as driving and facilitating dialogic and genuine encounters among multiple socialized consciousnesses.