topic

Decolonizing Technology

What does it mean to decolonize technology, and how can it be that technology generates a colonizing practice in the first place?

‘Obviously, no technological innovation will be able to solve our social problems.’ (from our interview with Clemens Apprich)

TOPIC CONTENT:

Fabric of Dreams. Towards a Technodiversity

Katharina Klappheck: The Joy of Being an Error

The Rage of Data

In Conversation with Clemens Apprich on ‘Decolonizing Technology’

The Trouble with Visibility

The Ecstasy of Noise

‘And this might be an opening for us: if we think about technologies as being a part of us, as reflections of ourselves, then we can finally engage with them properly.

At the heart of machine learning, which mostly catches our eye in the form of algorithmic recommender systems, is the simple logic of sifting through big chunks of data and turning them into information. This requires a particular logic or pattern that reflects our social behavior. After all, it is not the technologies that are male and white but the ideas they are fed.

[…] algorithms learn from our data and therefore adopt everything that is part of it, including our racist, sexist, classist and ableist prejudices as well as heteronormative concepts. We are thus confronted with self-fulfilling prophecies, which are then relabeled as objective decision-making processes.’*

Together with the department of Media Theory, we present lectures, talks and performances, mainly from the field of digital cultures, circling around the topic of ‘Decolonizing Technology’

With Wendy Chun, Nishant Shah, Hito Steyerl, Ramon Amaro, Tiara Roxanne, Katharina Klappheck, Mary Maggic, Tung-Hui Hu

*Excerpts from the interview with Clemens Apprich, head of the Department of Media Theory; find the full interview below

exhibition

Opening: 16 Mar 2023, 19:00

Running: 17 Mar 2023 – 12 May 2023

Fabric of Dreams. Towards a Technodiversity

Curated by Elisabeth Falkensteiner and Clemens Apprich

A cooperation with the department of Media Theory, concluding the series of talks and performances on ‘Decolonizing Technology’

Eating Experience by Marianna Mondelos

Opening concert by Rojin Sharafi

About the series ‘Decolonizing Technology’

Today, the prospects of digital technologies are both auspicious and frightening: digital assistants and whole new virtual environments, as well as promising advances in AI and machine learning on the one hand; surveillance capitalism, discriminatory datasets and life-threatening cyberwars on the other. How did this conflicting situation come about? To answer this question, we have to acknowledge that technologies are temporally and spatially produced, affected by different epistemes, ideologies, political interests, economic forces and cultural practices.

Accordingly, technological development is always fragmented.

Image by © kennedy+swan, Delphi Demons, videostill, 2022

The starting point for the exhibition ‘Fabric of Dreams: Towards a Technodiversity’ – which is the culmination of a semester-long lecture series – is the assumption that the weal and woe of digital media technologies have always been two sides of the same coin. Rethinking today’s techno-scientific model with its extractivist and increasingly violent logic calls for concepts capable of moving beyond the flawed idea that universal technological solutions are the answers to our social problems; and, conversely, the assumption that our political distortions are merely induced by technological developments. Instead, we need to lay bare the complex – and often contested – relations we entertain with our machines.

To do so, we need to genuinely engage with diverse understandings of technology and explore the in-between spaces of our socio-technical situation.

The exhibition raises the question of how we can escape this singular vision of technology and its rather definitive configuration as a fully automated machine intelligence. It therefore investigates new openings, narratives, and potentials by pursuing a two-fold goal: it seeks to address the challenges current technological transformations pose to art and artistic practices, and, at the same time, it also asks how those practices can in turn challenge the technological status quo.

Ultimately, the underlying concern is how art can contribute to the project of building a variety of futures, a true technodiversity.

Image by © Christiane Peschek, Oasis, videostill, 2022

In addition, the exhibition serves as a platform for discursive and performative interventions in technological discoveries and inventions. For this purpose, it will be structured around three key issues:

(i) the (prosthetic) body and disembodied experiences in the digital realm;

(ii) multiple ways of being in the world that challenge the human condition of intelligence and sentience; and

(iii) techno-ecologies and the synthetic foundation of organic matter.

The exhibition will engage with diverse and relatively unknown perspectives and narratives in order to better understand the challenges posed by our technological present – and to produce critical and promising visions of its multifaceted future.

Artists:
Christina Gruber
Cyrus Kabiru
kennedy+swan
Mary Maggic
Kumbirai Makumbe
Christiane Peschek
Luiza Prado de O. Martins
Anna Vasof
Christian Freude/Christina Jauernik/Johann Lurf/Fabian Puttinger/Rüdiger Suppin*

Program:

Save the Date!

4 May 2023

Loose Threads
Additional Discourse

Please note:

The exhibition is closed on 30 Mar and from 6–11 Apr

* Research Project: Unstable Bodies
Institute for Art and Architecture
Academy of Fine Arts Vienna
led by Wolfgang Tschapeller

  • Project funded by:
    Austrian Science Funds FWF AR574

  • Extended project team:
    Vicki Kirby (University of New South Wales)
    Thomas Lamarre (University of Chicago)

  • Images of the glasshouse with kind permission of:
    Ingeborg Lang, Andreas Schröfl, Thomas Joch, University of Vienna, Faculty of Life Sciences, Department of Functional and Evolutionary Ecology

Preview Image: Cyrus Kabiru, Wearing Miyale Ya Blue, 2021

video

Katharina Klappheck: The Joy of Being an Error

Lecture by Katharina Klappheck, followed by a dialogue with Katta Spiel, from Jan 2023, in German

Part of the series Decolonizing Technology

Artificial Intelligence (AI) as digital infrastructure raises the question of its mode of production:

Who is creating seemingly invisible technologies? What are the material foundations? Who is paying for their service?

Katharina Klappheck wants to explore these and other questions in their lecture from a disabled perspective.

The argument will be that disability is a condition of possibility for AI.

This can be traced back to the beginnings of scientific disciplines and their contemporary configurations.

Accordingly, disability represents the limits of AI. It symbolizes its failings as well as its supposed achievements as a transhuman artefact.

This ambivalence results in moments of oppression, but it also holds subversive potential. The chance of a different world is a critical point of refuge for reflections on alternative design and politics of digital infrastructures.

Katharina Klappheck, M.A., is a disabled political scientist. Their research interests include disability, queerness and AI as well as the politics of design. Katharina Klappheck completed their Master's dissertation on the gender binary and automated facial recognition systems under the supervision of Professor Barbara Prainsack at the University of Vienna.

They previously worked at the German Hygiene Museum as part of an interdisciplinary research team, analyzing accessibility and knowledge hierarchies regarding AI. At the Technical University of Dresden, Katharina Klappheck came up with a playful concept for a hackathon which envisioned the democratic digitalization of university administration.

They then went on to work on political equality issues, such as digitalization in terms of gender equality, at the German Bundestag. Currently, Katharina Klappheck is Head of Feminist Internet Policy at the Gunda Werner Institute at the Heinrich Böll Foundation where they are creating a Crippled Low Tech Lab.

Katta Spiel researches marginalised perspectives on technology. Their work informs design and engineering in critical ways to support the development of technologies that account for the diverse realities they operate in. Their research is situated at the intersection of Computer Science, Design and Cultural Studies. Drawing on methods from (Critical) Participatory Design and Action Research, Katta Spiel collaborates with neurodivergent and/or nonbinary peers in conducting explorations of novel potentials for designs, methodological contributions to Human-Computer Interaction and innovative technological artefacts.

video

The Rage of Data

Talk in English from October 2022

Wendy Hui Kyong Chun in Conversation with Nishant Shah – Moderation: Clemens Apprich

About the series Decolonizing Technology

Participants:

Wendy Hui Kyong Chun, Director of the Digital Democracies Institute, Simon Fraser University, Canada

Nishant Shah, Chair Professor Aesthetics and Cultures of Technology, ArtEZ University of the Arts / Radboud University, The Netherlands

Moderated by Clemens Apprich, Head of the Department of Media Theory, University of Applied Arts Vienna

Reception by Gerald Bast, Rector of the University of Applied Arts

In her new book Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal – not an error – within big data and machine learning.

These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Chun, who has a background in media studies and cultural theory as well as systems design engineering, will talk about her book with Nishant Shah. The conversation will revolve around the question of how big data and machine learning encode discrimination, create agitated clusters of comforting rage and call for alternative algorithms to foster a more democratic future.

How can we release ourselves from the vice-like grip of discriminatory data?

Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.

Wendy Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. resident attitudes to living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are only validated if they mirror this data.
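To make that last point tangible, here is a minimal, hypothetical sketch (toy data and names invented for illustration, not drawn from Chun’s book): a model trained on records skewed by over-policing is then ‘validated’ against a held-out slice of the very same records, so a high score certifies nothing but its fidelity to the skew.

```python
# A minimal sketch of a self-fulfilling prophecy (all data invented):
# a model trained on skewed records and validated against a held-out
# slice of the same records scores "well" simply by mirroring the skew.
import random

random.seed(0)

# Toy records: (district, flagged). District "A" has been heavily policed,
# so far more incidents were ever recorded there in the first place.
records = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 20 + [("B", 0)] * 80)
random.shuffle(records)
train, held_out = records[:150], records[150:]

# "Training": remember the majority label seen in each district.
counts = {}
for district, flagged in train:
    counts.setdefault(district, [0, 0])[flagged] += 1
model = {d: max((0, 1), key=lambda label: c[label]) for d, c in counts.items()}

# "Validation": test against held-out data drawn from the same skewed pool.
accuracy = sum(model[d] == y for d, y in held_out) / len(held_out)
print(model)              # district "A" is flagged wholesale
print(f"{accuracy:.0%}")  # a high score that only restates the bias
```

The validation step is the self-fulfilling part: because the held-out data comes from the same skewed pool, mirroring the skew is exactly what gets rewarded as accuracy.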

Nishant Shah is a feminist, humanist, technologist working in digital cultures. He wears many hats as an academic, researcher, educator and annotator, interested in translating research for public discourse and being informed by public discourse to orient his research.

In his official capacity, he is Chair and Professor of Aesthetics and Cultures of Technologies at ArtEZ University of the Arts (NL), Endowed Professor in Cultural Studies at the Radboud University (NL), as well as Knowledge Partner to Oxfam Novib (NL) and the Digital Asia Hub (Hong Kong/Singapore). His work is at the intersections of body, identity, digital technologies, artistic practice, and activism. His current interest is in thinking through questions of digital narrative practices towards building inclusive, diverse, resilient, and equitable societies.

The event is part of the series Decolonizing Technology in cooperation with the department of Media Theory

text

In Conversation with Clemens Apprich on ‘Decolonizing Technology’

About media technology and its political implications in terms of identity politics and the importance of critical thinking in digital cultures. What does it mean to ‘decolonize’ technology, and how can it be that technology generates a ‘colonizing’ practice in the first place?

Clemens Apprich is head of the department of Media Theory, which co-hosts AIL’s Talk Series on the topic of ‘Decolonizing Technology’

Image by © Universität für angewandte Kunst / Adam Berry, transmediale

Clemens Apprich

is a full professor and head of the department of Media Theory at the Angewandte. His current research focuses on filter algorithms and their use in data analysis procedures as well as machine learning methods. He is the author of Technotopia: A Media Genealogy of Net Cultures (Rowman & Littlefield International, 2017) and co-author of the book Pattern Discrimination together with Hito Steyerl, Florian Cramer and Wendy Chun (University of Minnesota Press, 2018)

Elisabeth Falkensteiner (Curator and Co-Head of AIL) talked to Clemens Apprich about media technology and its political implications in terms of identity politics and the importance of critical thinking in digital cultures.

Elisabeth Falkensteiner (EF)

In recent times, the political vision or, rather, the naïve utopian dream of a fundamental democratization of the internet has been dulled, just as the hope has faded that artificial intelligence and machine learning would develop ‘neutral’ digital worlds.

In fact, new technologies are even more likely to entrench inequality, reinforce racist and sexist tendencies and promote reactionary identity politics – which will have consequences in virtual space as well as in real life. After all, most 90s online cultures had their roots in activism and subcultures.

Where did we go wrong?

Clemens Apprich (CA)

Many technologies that constitute what we call the online world were created in the 90s – at a time when the future of the internet was still undecided and the subject of passionate debate. The mass distribution of so-called web 2.0 applications, building on the initial internet infrastructure and known as ‘social media’ platforms today, has led to a shift in the balance of power: away from public, social and artistic positions towards commercial interests. The use of these new media platforms has also led to a staggering increase in digital data. However, an enormous amount of computational power is required to generate ‘valuable’ information from this data, something only global corporations can afford these days.

And these automated pattern recognition systems are anything but impartial. On the contrary, algorithms learn from our data and therefore adopt everything that is part of it, including our racist, sexist, classist and ableist prejudices as well as heteronormative concepts. We are thus confronted with self-fulfilling prophecies, which are then relabeled as objective decision-making processes.

One of these prophecies is the assumption that any proximity between data points is ‘significant’ as such. This homophilic premise, which translates as ‘birds of a feather flock together’, is rooted in the US history of segregated housing.

Wendy Hui Kyong Chun, who will give a talk as part of our lecture series, reminds us in her new book that it is precisely this principle of segregation that continues to govern our online world.

Every time Amazon, Facebook, TikTok or Tinder recommend new products, friends, content or lovers, we come across homophilic clusters that fuel today’s toxic online climate. However, instead of being inherent to the technology, the issue of data bias and algorithmic discrimination concerns the logics these technologies are made to implement and establish; logics that come from retrograde – not to say reactionary – identity politics, and a cancel culture actually deserving that name, which blocks all non-identitarian content, eventually filtering out all we prefer not to see.
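As an editorial aside, the homophilic premise can be condensed into a few lines of code. The following minimal sketch (users and items invented; this is not any platform’s actual algorithm) implements a nearest-neighbour recommender whose only rule is that proximity between data points counts as significance:

```python
# Minimal sketch of a homophilic recommender (all names invented):
# "birds of a feather" turned into a rule - recommend whatever the
# most similar users already consume.
from collections import Counter

# Toy interaction data: user -> set of items they engaged with.
profiles = {
    "ana":  {"doc_a", "doc_b", "doc_c"},
    "ben":  {"doc_a", "doc_b", "doc_d"},
    "cara": {"doc_x", "doc_y"},
    "dev":  {"doc_x", "doc_z"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap of two profiles - proximity expressed as a single number."""
    return len(a & b) / len(a | b)

def recommend(user: str, k: int = 2) -> list:
    # Rank all other users purely by proximity; drop anyone with no overlap.
    sims = {u: jaccard(profiles[user], profiles[u])
            for u in profiles if u != user}
    peers = [u for u in sorted(sims, key=sims.get, reverse=True)
             if sims[u] > 0][:k]
    # Suggest what the nearest cluster consumes but this user hasn't yet.
    pool = Counter(i for p in peers for i in profiles[p] - profiles[user])
    return [item for item, _ in pool.most_common()]

print(recommend("ana"))  # ['doc_d'] - ana only ever sees her own cluster
```

Nothing in the sketch asks whether the overlap between profiles means anything; proximity alone does the work, and each recommendation pushes the user deeper into the cluster they already resemble.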

EF

In summary, one could therefore say that basic technologies are driven by a capitalist logic and hegemonic politics. Which current alternative methods and systems promote anti-discriminatory behavior?

Is there a way out of this mess?

CA

At the heart of machine learning, which mostly catches our eye in the form of algorithmic recommender systems, is the simple logic of sifting through big chunks of data and turning them into information. This requires a particular logic or pattern that reflects our social behavior. After all, it is not the technologies that are male and white but the ideas they are fed. And this might be an opening for us: if we think about technologies as being a part of us, as reflections of ourselves, then we can finally engage with them properly.

Hito Steyerl’s work is certainly a prime example in this regard: she does not concern herself with technical solutions, nor does she simply downplay the technological conditions we live in; instead she focuses on the beliefs, myths and concrete ideological interests that are portrayed and adopted by these technologies.

Her narrative and visual language reject the glossy aesthetics of most digital art. Instead of reviewing the latest computer hardware or software, she presents ‘poor images’ that stir and enable discussions about digital technologies – ranging from social media and virtual worlds to recent AI applications.

Obviously, no technological innovation will be able to solve our social problems.

You simply cannot fight racism by using better data or algorithms. It is a political fight which requires political organization; online media can only ever play a supportive role here. This also applies to any critique of techno-capitalism, which needs to take place in a political context.

Big digital corporations must play by the rules of society – especially in terms of taxation and collective data management (‘data trusts’) – and need internal reorganization. And the solutions to meet these challenges are as old as the algorithms themselves. This can be best illustrated by the unionization of Amazon workers on Staten Island, New York, an event taking place while their boss, Jeff Bezos, was busy exploring outer space. One of the richest men on earth tries to escape this planet, while his workers unionize and fight for its and their survival. This is a poignant, almost cartoonish depiction of our current situation.

EF

Can art or media art and interdisciplinary practices assist in decolonizing technology? What role can art play?

CA

I think we need to be careful when using context-specific terminology and concepts – especially when they are derived from certain historical contexts that range from the Haitian Revolution to the struggles for independence after the Second World War.

After all, the ‘decolonization of technology’ seems to be a conflicting term, not least because decolonization describes the retrieval of stolen land, whereas in the case of technology our concern is with the understanding of our own cultural practices.

In an extended sense, though, we might speak of a decentralization of this knowledge by analyzing the – oftentimes violent – origins of technology.

There are several good examples, such as Luiza Prado de O. Martins’ work, which focuses on alternative forms of knowledge and modern technology’s divergent points of origin – covering not only digital media but also other areas of interest like birth control.

And then there are also the artworks of Simon Denny, which render visible and thereby discuss the material conditions of our online world. His project ‘Mine’, for example, illustrates the nexus between the mining of data and of minerals – since both are integral to the functioning of our digital culture.

Especially the way artistic practices can engage with the production processes of our digital culture makes them capable of transcending current debates – which are mostly concerned with better data or better models – and of creating new technological visions.

In my opinion, Ramon Amaro, who will give a talk in early December together with Tiara Roxanne, is one of the few people who have truly accepted this challenge. Referencing theories by Frantz Fanon and Gilbert Simondon, he demands a radical break in the relationship with our technologies.

With a background in mechanical engineering as well as art history, he interprets technological systems, and especially AI, as reflections of ourselves with all the problems that come with it. His upcoming book, The Black Technical Object: On Machine Learning and the Aspiration of Black Being, focuses on the highly problematic and mostly racist history and logic of modern statistics, which form the basis of today’s machine learning systems. His demands therefore do not stop at improving these processes but press for an epistemological break, so as to be able to create something new.

EF

When we look at social media and its news feed algorithms, which not only fuel the attention economy of their users but also divide them into separate bubbles, we can see social media losing its utopian and emancipatory potential. The pandemic and the surrounding debate have instead underlined how social media contributes to social divisiveness. Still, what were the initial principles and ideas of social networks?

CA

In the early 2000s, the media theorist Tiziana Terranova already emphasised the significance of ‘free labour’ in digital societies. Her analysis has become even more relevant with social media platforms: on and with these platforms we produce a digital economy whose profits benefit the few instead of the many. Sadly, we do this in the belief that we are being ‘authentic’; everyone is convinced they are being creative, while in fact being caught in the behavioral patterns and restrictions imposed by tech companies. Bernard Stiegler once called this the ‘systemic stupidity’ of our digital culture.

However, what I like about Tiziana Terranova’s work is that she never tires of stressing the ‘social’ in social media. As a matter of fact, social media algorithms are not interested in the individual person – their holiday photos or music collections – but rather in their relationships to other individuals in a given network; this ever-changing trans-individuality is still an important, yet underrated fact which we need to take into account when critiquing our capitalist digital society.

Generally, all technologies are no more than crystallisations of our social relations. For this reason, we also need to find new ways to engage with them. The theoretical field of Disability Studies could be an interesting point of departure, as it traditionally focuses on scrutinizing what we perceive as ‘normal’ in our culture.

As part of our lecture series, Katharina Klappheck will talk about machine learning processes in terms of disability and the unruly. And in March, Tung-Hui Hu will speak to us in a similar way about non-Western concepts of digital cultures in his lecture on analogue internet culture.

If we want to counter the divisive potential of these technologies, we should not – even in the face of justified criticism of our excessively mediated society – submit to any kind of cultural pessimism or technophobia. For there is always a playful way to deal with our neuroses and technological dependencies.

Thank you, Clemens Apprich, for the interview!

talk

01 Dec 2022, 19:00

The Trouble with Visibility

Lecture by Ramon Amaro & Performance by Tiara Roxanne, moderated by Nelly Y. Pinkrah. [Interpretation into sign language will be provided for this event / Talk will be held in English]

This event is part of the series Decolonizing Technology in cooperation with the department of Media Theory

Ramon Amaro explores how the history of data and statistical analysis provide a clear (and often sudden) grasp of the complex relationship between race and machine learning.

Amaro juxtaposes a practical analysis of machine learning with a theory of Black alienation in order to inspire alternative approaches to contemporary algorithmic practice. In doing so, Amaro offers a continuous contemplation on the abstruse nature of machine learning, mathematics, and the deep incursion of racial hierarchy.

Ramon Amaro’s writing, research and practice emerge at the intersections of Black Study, psychopathology, digital culture, and the critique of computational reason. He draws on Frantz Fanon’s theory of sociogenic alienation to problematise the de-localisation of the Black psyche in contemporary computational systems, such as machine learning and artificial intelligence.

His recent book The Black Technical Object aims to introduce the history of statistical analysis and a knowledge of sociogenesis – a system of racism amenable to scientific explanation – into machine learning research as an act of impairing the racial ordering of the world.

While machine learning – computer programming designed for taxonomic patterning – provides useful insight into racism and racist behavior, a gap is present in the relationship between machine learning, the racial history of scientific explanation, and the Black lived experience.

Tiara Roxanne’s

work contends that AI is colonial in creation and nature, as its invention is founded on a settler colonial paradigm. In acknowledging this certainty, she states ‘I cannot decolonize my body’ and performs it as an incantation of survival and ancestral reconciliation. In performance, Roxanne is (dis)entangled within material and digital colonial borders of the past, present and future.

As part of an ongoing project entitled Red, Tiara Roxanne will illustrate through performance the relationship between Artificial Intelligence (AI) and Indigeneity.

By reinterpreting processes of colonial recovery through a performative and critical lens, Roxanne suggests that we will arrive at sovereign, Indigenous notions of digital borders, data colonialism and storytelling.

Image by © Agustín Farías

Tiara Roxanne

Tiara Roxanne is a Postdoctoral Fellow at Data & Society in NYC. They are a Tarascan Mestiza scholar and artist based in Berlin. Their research and artistic practice investigate the encounter between Indigeneity and AI by interrogating colonial structures embedded within machine learning systems. As a performance artist and practitioner, Roxanne works between the digital and the material using textile. Currently their work is mediated through the color red.

Roxanne has presented at Images Festival (Toronto), Squeaky Wheel Film and Media Art Center (NY), Trinity Square Video (Toronto), European Media Art Festival (Osnabrück), University of Applied Arts (Vienna), SOAS (London), SLU (Madrid), Transmediale (Berlin), Duke University (NC), Tech Open Air (Berlin), AMOQA (Athens), Zurich University of the Arts (Zurich), Autonomous Intercultural Indigenous University (Colombia), Utrecht University (NL), University of California (San Diego), Münchener Kammerspiele (Munich), Laboratorio Arte Alameda (Mexico City), among others.

Moderation by Nelly Y. Pinkrah

Nelly Y. Pinkrah is a cultural and media theorist and political activist, mainly involved in anti-racist, empowerment, and community-building projects. Her areas of interest include (digital) media and technology, Black studies and Black feminist theory, political thought and practice, and cultural history.

talk

03 Nov 2022, 19:00

The Ecstasy of Noise

The one photo to end all wars – A Lecture by Hito Steyerl

This talk is part of the lecture series Decolonizing Technology in cooperation with the department of Media Theory

The lecture will revolve around machine learning and the archive of war photography. Hito Steyerl will be present online.

Image by © Rolf Vennenbernd

Hito Steyerl is Professor of Experimental Film and Video at Berlin University of the Arts. She studied cinematography and documentary filmmaking in Tokyo and Munich and holds a PhD in Philosophy from the Academy of Fine Arts Vienna.

Her principal topics of interest are media, technology and the global circulation of images. Her texts, performances and essayistic documentary films are concerned with postcolonial critique and feminist logics of representation. Her works are usually a product of both visual art and film, theory and praxis. Many of them have been presented at the Venice Biennale, the Los Angeles Museum of Contemporary Art and the Museum of Modern Art in New York, among others.

She has also taught at the Center for Cultural Studies at Goldsmiths College in London and was visiting professor at the Royal Academy of Copenhagen and the Academy of Fine Arts in Helsinki.

Moderation: Clemens Apprich, Media Theory, University of Applied Arts

Welcome: Elisabeth Falkensteiner, Curator and Co-Head of Angewandte Interdisciplinary Lab