topic

Decolonizing Technology

What does it mean to decolonize technology, and how can it be that technology generates a colonizing practice in the first place?

‘Obviously, no technological innovation will be able to solve our social problems.’ (from our Interview with Clemens Apprich)

TOPIC CONTENT:

Exhibition View: Fabric of Dreams. Towards a Technodiversity

Wendy Hui Kyong Chun: The Rage of Data

Ramon Amaro: The Trouble with Visibility

In Conversation with Clemens Apprich on ‘Decolonizing Technology’

Loose Threads – Towards a Technodiversity

Loose Threads – Towards a Technodiversity

The Ecstasy of Noise

‘And this might be an opening for us: if we think about technologies as being a part of us, as reflections of ourselves, then we can finally engage with them properly.

At the heart of machine learning, which mostly catches our eye in the form of algorithmic recommender systems, is the simple logic of sifting through big chunks of data and turning them into information. This requires a particular logic or pattern that reflects our social behavior. After all, it is not the technologies that are male and white but the ideas they are fed.

[…] algorithms learn from our data and therefore adopt everything that is part of it, including our racist, sexist, classist and ableist prejudices as well as heteronormative concepts. We are thus confronted with self-fulfilling prophecies, which are then relabeled as objective decision-making processes.’*

Together with the department of Media Theory, we present lectures, talks and performances, mainly from the field of digital cultures, circling around the topic of ‘Decolonizing Technology’

With Wendy Chun, Nishant Shah, Hito Steyerl, Ramon Amaro, Tiara Roxanne, Katharina Klappheck, Mary Maggic, Tung-Hui Hu

*Excerpts from the interview with Clemens Apprich, head of the Department of Media Theory; find the whole interview below

image

Exhibition View: Fabric of Dreams. Towards a Technodiversity

Curated by Elisabeth Falkensteiner and Clemens Apprich. 17 Mar 2023 – 12 May 2023

A cooperation with the department of Media Theory, concluding the series of talks and performances on ‘Decolonizing Technology’

Today, the prospects of digital technologies are both auspicious and frightening: digital assistants and whole new virtual environments, as well as promising advances in AI and machine learning, on the one hand; surveillance capitalism, discriminatory datasets and life-threatening cyberwars on the other. How did this conflicting situation come about? To answer this question, we have to acknowledge that technologies are temporally and spatially produced, affected by different epistemes, ideologies, political interests, economic forces and cultural practices.

Accordingly, technological development is always fragmented.

Image by © Christian Freude/Christina Jauernik/Johann Lurf/Fabian Puttinger/Rüdiger Suppin, Velvet Eyes, 2023, Installation, Auto-stereoscopic medium format slide projection. Projectors, wood, steel, plexiglass, paper, velvet, velcro, and engine (Photo: Paul Pibernig)
Image by © Luiza Prado O. Martins, Rome Is No Longer In Rome, It Is Wherever I Am, Installation, 3D Print, 2023 (Photo: Lea Dörl)
Image by © Anna Vasof, The Second Life of Burned Trees, Video, 2023 (Photo: Anna Vasof)
Image by © Mary Maggic, FASTER, HIGHER, STRONGER, Interactive Installation, Bioreactor, Fitness Bike, 2022 (Photo: Paul Pibernig)
Image by © Christina Gruber, Black Gold, Installation, Prints, 2020 (Photo: Lea Dörl)
Image by © kennedy+swan, Manifesto of Fragility (Morning Routine), Video, 2022 (Photo: Lea Dörl)
Image by © Cyrus Kabiru, C-Stunners Series, Photographs, 2012–22 (Photo: Lea Dörl)
Image by © Anna Vasof, Things and Wonders, Series of Video Works, 2018–2023 (Photo: Lea Dörl)
Image by © Christiane Peschek, OASIS, Mixed Media Installation, 2022 (Photo: Paul Pibernig)
Image by © Kumbirai Makumbe, Living Doesn't Mean You're Alive, Video and Sculptures, 2021 (Photo: Paul Pibernig)
Image by © kennedy+swan, Delphi Demons, Stereoscopic film, 2022 (Photo: kennedy+swan, video still)

video

Wendy Hui Kyong Chun: The Rage of Data

Talk in English from October 2022

Wendy Hui Kyong Chun in Conversation with Nishant Shah – Moderation: Clemens Apprich

About the series Decolonizing Technology

Participants:

Wendy Hui Kyong Chun, Director of the Digital Democracies Institute, Simon Fraser University, Canada

Nishant Shah, Chair Professor of Aesthetics and Cultures of Technology, ArtEZ University of the Arts / Radboud University, The Netherlands

Moderated by Clemens Apprich, Head of the Department of Media Theory, University of Applied Arts Vienna

Reception by Gerald Bast, Rector of the University of Applied Arts

In her new book Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal – not an error – within big data and machine learning.

These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Chun, who has a background in media studies and cultural theory as well as systems design engineering, will talk about her book with Nishant Shah. The conversation will revolve around the question of how big data and machine learning encode discrimination, create agitated clusters of comforting rage and call for alternative algorithms to foster a more democratic future.

How can we release ourselves from the vice-like grip of discriminatory data?

Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.

Wendy Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. residents' attitudes to living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are only validated if they mirror this data.

Nishant Shah is a feminist, humanist, technologist working in digital cultures. He wears many hats as an academic, researcher, educator and annotator, interested in translating research for public discourse and being informed by public discourse to orient his research.

In his official capacity, he is Chair and Professor of Aesthetics and Cultures of Technologies at ArtEZ University of the Arts (NL), Endowed Professor in Cultural Studies at the Radboud University (NL), as well as Knowledge Partner to Oxfam Novib (NL) and the Digital Asia Hub (Hong Kong/Singapore). His work is at the intersections of body, identity, digital technologies, artistic practice, and activism. His current interest is in thinking through questions of digital narrative practices towards building inclusive, diverse, resilient, and equitable societies.

The event is part of the series Decolonizing Technology in cooperation with the department of Media Theory

video

Ramon Amaro: The Trouble with Visibility

Lecture in English from 2022

Accompanied by a Dialogue with Tiara Roxanne, moderated by Nelly Y. Pinkrah

About the Series ‘Decolonizing Technology’

Ramon Amaro explores how the history of data and statistical analysis provides a clear (and often sudden) grasp of the complex relationship between race and machine learning.

Amaro juxtaposes a practical analysis of machine learning with a theory of Black alienation in order to inspire alternative approaches to contemporary algorithmic practice. In doing so, Amaro offers a continuous contemplation on the abstruse nature of machine learning, mathematics, and the deep incursion of racial hierarchy.

Ramon Amaro’s writing, research and practice emerge at the intersections of Black Study, psychopathology, digital culture, and the critique of computational reason. He draws on Frantz Fanon’s theory of sociogenic alienation to problematise the de-localisation of the Black psyche in contemporary computational systems, such as machine learning and artificial intelligence.

His recent book The Black Technical Object aims to introduce the history of statistical analysis and a knowledge of sociogenesis – a system of racism amenable to scientific explanation – into machine learning research as an act of impairing the racial ordering of the world.

While machine learning – computer programming designed for taxonomic patterning – provides useful insight into racism and racist behavior, a gap is present in the relationship between machine learning, the racial history of scientific explanation, and the Black lived experience.

Tiara Roxanne is a Postdoctoral Fellow at Data & Society in NYC. They are a Tarascan Mestiza scholar and artist based in Berlin. Their research and artistic practice investigate the encounter between Indigeneity and AI by interrogating colonial structures embedded within machine learning systems. As a performance artist and practitioner, Roxanne works between the digital and the material using textiles. Currently their work is mediated through the color red.

Roxanne has presented at Images Festival (Toronto), Squeaky Wheel Film and Media Art Center (NY), Trinity Square Video (Toronto), European Media Art Festival (Osnabrück), University of Applied Arts (Vienna), SOAS (London), SLU (Madrid), Transmediale (Berlin), Duke University (NC), Tech Open Air (Berlin), AMOQA (Athens), Zurich University of the Arts (Zurich), Autonomous Intercultural Indigenous University (Colombia), Utrecht University (NL), University of California (San Diego), Münchener Kammerspiele (Munich), Laboratorio Arte Alameda (Mexico City), among others.

Nelly Y. Pinkrah is a cultural and media theorist and political activist, mainly involved in anti-racist, empowerment, and community-building projects. Areas of interest: (digital) media and technology, black studies & black feminist theory, political thoughts and practices, and cultural history.

text

In Conversation with Clemens Apprich on ‘Decolonizing Technology’

About media technology and its political implications in terms of identity politics and the importance of critical thinking in digital cultures. What does it mean to ‘decolonize’ technology, and how can it be that technology generates a ‘colonizing’ practice in the first place?

Clemens Apprich is head of the department of Media Theory, which co-hosts AIL’s Talk Series on the topic of ‘Decolonizing Technology’

Read more on the subject (in German)

Image by © Universität für angewandte Kunst / Adam Berry, transmediale

Clemens Apprich

is full professor and head of the department of Media Theory at Angewandte. His current research focuses on filter algorithms and their use in data analysis procedures as well as machine learning methods. He is the author of Technotopia: A Media Genealogy of Net Cultures (Rowman & Littlefield International, 2017) and co-editor of the book Pattern Discrimination together with Hito Steyerl, Florian Cramer and Wendy Chun (University of Minnesota Press, 2018)

Elisabeth Falkensteiner (Curator and Co-Head of AIL) talked to Clemens Apprich about media technology and its political implications in terms of identity politics and the importance of critical thinking in digital cultures.

Elisabeth Falkensteiner (EF)

In recent times, the political visions – or, rather, the naïve utopian dream – of a fundamental democratization of the internet have been dulled, just as the hope has faded that artificial intelligence and machine learning will develop ‘neutral’ digital worlds.

In fact, new technologies are even more likely to entrench inequality, reinforce racist and sexist tendencies and promote reactionary identity politics – which will have consequences in virtual space as well as in real life. After all, most 90s online cultures had their roots in activism and subcultures.

Where did we go wrong?

Clemens Apprich (CA)

Many technologies that constitute what we call the online world were created in the 90s – at a time when the future of the internet was still undecided and the subject of passionate debate. The mass distribution of so-called web 2.0 applications, building on the initial internet infrastructure and known as ‘social media’ platforms today, has led to a shift in the balance of power: away from public, social and artistic positions towards commercial interests. The use of these new media platforms has also led to a staggering increase in digital data. However, an enormous amount of computational power is required to generate ‘valuable’ information from this data, something only global corporations can afford these days.

And these automated pattern recognition systems are anything but impartial. On the contrary, algorithms learn from our data and therefore adopt everything that is part of it, including our racist, sexist, classist and ableist prejudices as well as heteronormative concepts. We are thus confronted with self-fulfilling prophecies, which are then relabeled as objective decision-making processes.

One of these prophecies is the assumption that any proximity between data points is ‘significant’ as such. This homophilic premise, which translates as ‘birds of a feather flock together’, is rooted in the US history of segregated housing.

Wendy Hui Kyong Chun, who will give a talk as part of our lecture series, reminds us in her new book that it is precisely this principle of segregation that continues to govern our online world.

Every time Amazon, Facebook, TikTok or Tinder recommend and suggest new products, friends, content or lovers, we come across homophilic clusters that fuel today’s toxic online climate. However, instead of being inherent to the technology, the issue of data bias and algorithmic discrimination concerns the logics these technologies are made to implement and establish; logics that come from retrograde – not to say reactionary – identity politics, and a cancel culture actually deserving that name, which blocks all non-identitarian content, eventually filtering out all we prefer not to see.

EF

In summary, one can therefore say that basic technologies are driven by a capitalist logic and hegemonic politics. Which current alternative methods and systems promote anti-discriminatory behavior?

Is there a way out of this mess?

CA

At the heart of machine learning, which mostly catches our eye in the form of algorithmic recommender systems, is the simple logic of sifting through big chunks of data and turning them into information. This requires a particular logic or pattern that reflects our social behavior. After all, it is not the technologies that are male and white but the ideas they are fed. And this might be an opening for us: if we think about technologies as being a part of us, as reflections of ourselves, then we can finally engage with them properly.

Hito Steyerl’s work is certainly a prime example in this regard: she neither concerns herself with technical solutions nor simply downplays the technological conditions we live in; instead she focuses on the beliefs, myths and concrete ideological interests that are portrayed and adopted by these technologies.

Her narrative and visual language reject the glossy aesthetics of most digital art. Instead of reviewing the latest computer hardware or software, she presents ‘poor images’ that stir and enable discussions about digital technologies – ranging from social media and virtual worlds to recent AI applications.

Obviously, no technological innovation will be able to solve our social problems.

You simply cannot fight racism by using better data or algorithms. It is a political fight which requires political organization; online media can only ever play a supportive role here. This also applies to any critique of techno-capitalism, which needs to take place in a political context.

Big digital corporations must play by the rules of society – especially in terms of taxation and collective data management (‘data trusts’) – and need internal reorganization. And the solutions to meet these challenges are as old as the algorithms themselves. This can be best illustrated by the unionization of Amazon workers on Staten Island, New York, an event taking place while their boss, Jeff Bezos, was busy exploring outer space. One of the richest men on earth tries to escape this planet, while his workers unionize and fight for its and their survival. This is a poignant, almost cartoonish depiction of our current situation.

EF

Can art or media art and interdisciplinary practices assist in decolonizing technology? What role can art play?

CA

I think we need to be careful when using context-specific terminology and concepts – especially when they are derived from certain historical contexts that range from the Haitian Revolution to the struggles for independence after the Second World War.

After all, the ‘decolonization of technology’ seems to be a conflicting term, not least because decolonization describes the retrieval of stolen land, whereas in the case of technology our concern is with the understanding of our own cultural practices.

In an extended sense, though, we might speak of a decentralization of this knowledge by analyzing the – oftentimes violent – origins of technology.

There are several good examples, such as Luiza Prado de O Martins’ work, which focuses on alternative forms of knowledge and modern technology’s divergent points of origin – covering not only digital media but also other areas of interest like birth control.

And then there are also the artworks of Simon Denny, which render visible and thereby discuss the material conditions of our online world. His project ‘Mine’, for example, illustrates the nexus between the mining of data and of minerals – since both are integral to the functioning of our digital culture.

The way artistic practices can engage with the production processes of our digital culture, in particular, enables them to transcend current debates – which are mostly concerned with better data or better models – and create new technological visions.

In my opinion, Ramon Amaro, who will give a talk in early December, together with Tiara Roxanne, is one of few people who have truly accepted this challenge. Referencing theories by Frantz Fanon and Gilbert Simondon, he demands a radical break in the relationship with our technologies.

With a background in mechanical engineering as well as art history, he interprets technological systems, and especially AI, as reflections of ourselves with all the problems that come with it. His upcoming book, The Black Technical Object: On Machine Learning and the Aspiration of Black Being, focuses on the highly problematic and mostly racist history and logic of modern statistics, which form the basis of today’s machine learning systems. His demands therefore do not stop at improving these processes but press for an epistemological break, so as to be able to create something new.

EF

When we look at social media and its news feed algorithms, which not only fuel the attention economy of their users but also divide them into separate bubbles, we can see it losing its utopian and emancipatory potential. The pandemic and the surrounding debate have instead underlined how social media contributes to social divisiveness. Still, what were the initial principles and ideas of social networks?

CA

In the early 2000s, the media theorist Tiziana Terranova already emphasised the significance of ‘free labour’ in digital societies. Her analysis has become even more relevant with social media platforms: on and with these platforms we produce a digital economy and profits that only benefit a few instead of the many. Sadly, we do this in the belief that we are being ‘authentic’: everyone is convinced they are being creative, while in fact being caught in the behavioral patterns and restrictions imposed by tech companies. Bernard Stiegler once called this the ‘systemic stupidity’ of our digital culture.

However, what I like about Tiziana Terranova’s work is that she never tires of stressing the ‘social’ in social media. As a matter of fact, social media algorithms are not interested in the individual person – their holiday photos or music collections – but rather in the relationships to other individuals in a given network; this ever-changing trans-individuality is still an important, yet underrated fact which we need to take into account when critiquing our capitalist digital society.

Generally, all technologies are no more than crystallisations of our social relations. For this reason, we also need to find new ways to engage with them. The theoretical field of Disability Studies could be an interesting point of departure, as it traditionally focuses on scrutinizing what we perceive as ‘normal’ in our culture.

As part of our lecture series, Katharina Klappheck will talk about machine learning processes in terms of disability and the unruly. And in March, Tung-Hui Hu will speak to us in a similar way about non-Western concepts of digital cultures in his lecture on analogue internet culture.

If we want to counter the divisive potential of these technologies, we should not – even in the face of justified criticism of our excessive mediated society – submit to any kind of cultural pessimism or technophobia. For there is always a playful approach to deal with our neuroses and technological dependencies.

Thank you, Clemens Apprich, for the interview!

discussion

04 May 2023, 17:00

Loose Threads – Towards a Technodiversity

Panel with Kristoffer Gansing, Tung-Hui Hu, Luiza Prado de O. Martins / Moderation: Clemens Apprich & Elisabeth Falkensteiner

Extensive Program to the Exhibition Fabric of Dreams – Towards a Technodiversity. In cooperation with the department of Media Theory.

The side program entitled Loose Threads weaves together different and diverse understandings of technology. The panel discussion looks at the reductive history of technology and its mode of representation.

The current exhibition Fabric of Dreams. Towards a Technodiversity raises the question of how we can escape the singular vision of technology and its rather definitive configuration as a fully automated machine intelligence. In the Western conception, technology is described as a history of development toward progress and expansion, be it in a dystopian or utopian implementation. Drawing on Foucault's discourse analysis of knowledge systems, philosopher Yuk Hui argues for different epistemes in relation to the narrative of technology, which he calls Technodiversity.

Panelist:

Kristoffer Gansing

Kristoffer Gansing is a media theorist and curator with a PhD in media studies from the School of Arts & Communication at Malmö University in Sweden. He is currently professor of Artistic Research at the Royal Danish Academy of Fine Arts in Copenhagen where he also directs the International Center for Knowledge in the Arts. Previously, Mr. Gansing was artistic director of the art and digital culture festival, Transmediale, in Berlin, where he directed nine editions from 2012 to 2020. 




Intersecting art, research and media technology, Gansing’s work has often taken the form of critical interrogations of art and technology from a post-digital perspective, in which digitisation forms part of everyday life. His PhD thesis, Transversal Media Practices, included two case studies on how media archaeological art practices reconfigure linear conceptions of technological development and was published by Malmö University Press in 2013. With Ryan Bishop, Jussi Parikka and Elvia Wilk he edited across & beyond – a transmediale reader on Post-digital Practices, Concepts, and Institutions, published by Sternberg Press in 2016. In 2020, he edited The Eternal Network – The Ends and Becomings of Network Culture, published through the Institute of Network Cultures in Amsterdam. His current research focuses on the techno-aesthetics of infrastructure and will be published in a forthcoming short form in 2022. Through his research, curatorial activity and writing, Gansing develops transversal and post-digital perspectives and methodologies that aim to both situate and transform cultural practices. He is particularly interested in moving beyond representational forms in favour of situated and performative aesthetics.



Tung-Hui Hu

Poet, former network engineer and scholar of digital media Tung-Hui Hu will give insights into his new project about digital infrastructure in the global South. In his book A Prehistory of the Cloud he investigates ideologies behind digital technology and offers a set of new tools for rethinking the contemporary digital environment.


Hu currently lives in Rome, where he is a 2022-23 Rome Prize Fellow in Literature at the American Academy in Rome. He is the author of three books of poetry, most recently Greenhouses, Lighthouses, which grew out of his graduate studies in film, as well as two studies of digital culture, A Prehistory of the Cloud and Digital Lethargy: Dispatches from an Age of Disconnection, an exploration of burnout, isolation, and disempowerment in the digital underclass.

A former network engineer, he is an associate professor of English at the University of Michigan. He is at work on two new projects: a book of poems on the idea of punishment, and a book on digital infrastructure in the global South. He has received an American Academy in Berlin Prize for his research and a National Endowment for the Arts fellowship for his poetry. His poems have been published in places such as Boston Review, The New Republic, Ploughshares, the Academy of American Poets's Poem-a-Day, and the anthology Family Resemblance: An Anthology and Exploration of Hybrid Literary Genres. In 2023-24, he will be a Humboldt Research Fellow at Martin-Luther-Universität Halle, Germany.

Luiza Prado de O. Martins

Luiza Prado de O. Martins’s artistic exploration examines plant-human relations, reproduction and herbal medicine, as well as questions of reproductive rights, through a feminist and anti-colonial lens. In her installation within the exhibition, she investigates the commodification of a highly valued but extinct plant and its insertion into early monetary systems as a capitalist technology.


Luiza Prado de O. Martins is an artist, writer, educator, and researcher investigating plant-human relations, reproduction, herbal medicine, and radical, decolonising care. Her body of artistic work spans video, food, performance, and sculpture, examining questions of reproductive rights through a feminist and anti-colonial lens, with a particular interest in herbalist medicinal practices. Her ongoing artistic research project, “Un/Earthings and Moon Landings”, narrates the extinction and later reappearance of an ancient contraceptive, aphrodisiac and spice, called silphium, through a series of artworks. The project explores the limits of archival practices in landscapes affected by anthropogenic climate change. In the past, Prado has exhibited work at the Art Institute of Chicago, the Museum of Modern Art Warsaw, Haus der Kulturen der Welt, Savvy Contemporary, Akademie Schloss Solitude, and Kampnagel, among others. She is part of the art duo We Work in the Dark, together with musician and artist Obaro Ejimiwe/Ghostpoet.

Program Overview

17:00–24:00

5 pm Exhibition tour with artists and curators

6 pm Panel with:

  • Kristoffer Gansing 


  • Tung-Hui Hu


  • Luiza Prado de O. Martins


  • Moderation: Clemens Apprich & Elisabeth Falkensteiner

8 pm Performance Beyond My Skin
A project by Flavia Mazzanti, produced by Immerea, Performers: Olivia Hild, Imani Rameses, Sound: Brootworth

9 pm Sound Performance by
Dr. Obaro Ejimiwe/Ghostpoet

10 pm DJ-Set by
isocialbutterflyy

More info

performance

04 May 2023, 20:00

Loose Threads – Towards a Technodiversity

Performances Beyond My Skin, Dr. Obaro Ejimiwe (Ghostpoet), isocialbutterflyy (DJ-set)

Extensive Program to the Exhibition Fabric of Dreams – Towards a Technodiversity. In cooperation with the department of Media Theory.

About the series ‘Decolonizing Technology’

About the Exhibition

Loose Threads Discourse


Live Performance

Beyond My Skin

A project by Flavia Mazzanti
Produced by Immerea
Performers: Olivia Hild, Imani Rameses
Sound: Brootworth

Beyond My Skin is an interactive installation here presented in the form of a live performance that explores themes of identity and digital inclusion. The project creates a phygital space where two performers can experience a new form of self-awareness and representation outside of traditional binary and societal portrayals (avatars). Translated movements generate hybrid digital bodies through interactions, creating a new collective consciousness of where our bodies begin and end. The performers explore the hybrid relationship between bodies and their digital representation in real-time, investigating something as physical as the feeling of "touch" and its meaning in the digital realm. Is it possible to be physically apart but touch each other digitally? And how would this affect our physical bodies?


Live Sound Performance

Dr. Obaro Ejimiwe (Ghostpoet)

Dr. Obaro Ejimiwe is a visual artist and musician whose work examines themes around African spiritualism, colonisation, masculinity, identity and black joy.

Image by © Dila Kaplan

DJ-Set

isocialbutterflyy (POSSY/she/her)

isocialbutterflyy, an artist, costume designer and DJ based in Vienna, founded POSSY in 2017 due to a lack of FLINTA* visibility in Hamburg’s club culture. In her work, she investigates interdisciplinary and unknown images in sound, film and theatre.

Musically, she mixes spherical soundscapes with housy and breaking beats, drawing on influences from diverse genres such as pop, jungle, R&B and trance.



talk

03 Nov 2022, 19:00

The Ecstasy of Noise

The one photo to end all wars – A Lecture by Hito Steyerl

This talk is part of the lecture series Decolonizing Technology in cooperation with the department of Media Theory


The lecture will revolve around machine learning and the archive of war photography. Hito Steyerl will be present online.

Image by © Rolf Vennenbernd

Hito Steyerl is Professor of Experimental Film and Video at Berlin University of the Arts. She studied cinematography and documentary filmmaking in Tokyo and Munich and holds a PhD in Philosophy from the Academy of Fine Arts Vienna.

Her principal topics of interest are media, technology and the global circulation of images. Her texts, performances and essayistic documentary films are concerned with postcolonial critique and feminist logics of representation. Her works are usually a product of both visual art and film, theory and praxis. Many of them have been presented at the Venice Biennale, the Los Angeles Museum of Contemporary Art and the Museum of Modern Art in New York, among others.

She has also taught at the Center for Cultural Studies at Goldsmiths College in London and was visiting professor at the Royal Academy of Copenhagen and the Academy of Fine Arts in Helsinki.

Moderation: Clemens Apprich, Media Theory, University of Applied Arts

Welcome: Elisabeth Falkensteiner, Curator and Co-Head of Angewandte Interdisciplinary Lab