
Wendy Hui Kyong Chun: The Rage of Data

Talk in English from October 2022

Wendy Hui Kyong Chun in Conversation with Nishant Shah – Moderation: Clemens Apprich

About the series Decolonizing Technology

Participants:

Wendy Hui Kyong Chun, Director of the Digital Democracies Institute, Simon Fraser University, Canada

Nishant Shah, Chair and Professor of Aesthetics and Cultures of Technology, ArtEZ University of the Arts / Radboud University, The Netherlands

Moderated by Clemens Apprich, Head of the Department of Media Theory, University of Applied Arts Vienna

Welcome by Gerald Bast, Rector of the University of Applied Arts Vienna

In her new book Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal – not an error – within big data and machine learning.

These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Chun, who has a background in media studies and cultural theory as well as systems design engineering, will discuss her book with Nishant Shah. The conversation will revolve around the question of how big data and machine learning encode discrimination and create agitated clusters of comforting rage, and around Chun's call for alternative algorithms to foster a more democratic future.

How can we release ourselves from the vice-like grip of discriminatory data?

Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.

Wendy Chun, drawing on her background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. resident attitudes to living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are validated only if they mirror this data.

Nishant Shah is a feminist, humanist technologist working in digital cultures. He wears many hats as an academic, researcher, educator, and annotator, interested in translating research for public discourse and in letting public discourse inform and orient his research.

In his official capacity, he is Chair and Professor of Aesthetics and Cultures of Technology at ArtEZ University of the Arts (NL), Endowed Professor in Cultural Studies at Radboud University (NL), as well as Knowledge Partner to Oxfam Novib (NL) and the Digital Asia Hub (Hong Kong/Singapore). His work sits at the intersections of body, identity, digital technologies, artistic practice, and activism. His current interest is in thinking through questions of digital narrative practices towards building inclusive, diverse, resilient, and equitable societies.

The event is part of the series Decolonizing Technology, in cooperation with the Department of Media Theory.