Sharp Image, Vague Face: Disrupting Facial Transparency in A.I. through a Diasporic Approach

Abstract: Algorithmic bias often arises from a lack of data diversity. A commonly adopted solution is to improve an A.I. with datasets drawn from minority communities, which implies an even broader and more invasive process of data harvesting. Unfortunately, recent discussions of A.I. ethics fail to recognize this procedure as a constant exposure of the marginalized, including the diaspora, and to consider the latent risks brought by such inescapable watching and listening. Meanwhile, the potential of the diaspora’s elusive identity has received scant attention as a possible way of resisting the persistent gaze of the dominant. Addressing this knowledge gap, my research criticizes compulsory transparency in facial recognition as an exercise of power while imagining an alternative, indistinct A.I. ethics originating from the diaspora. First, my study elucidates how power is exercised through the pursuit of facial transparency and certainty, and it elaborates on how the making of facial datasets colludes with colonial photography on this point. Second, my research unpacks a poetic opacity rooted in the nomadic identity of the diaspora. Such ambiguity has the potential to contribute to an A.I. ethics focusing on marginalized communities and to resist top-down viewing. In addition, my study takes the documentary Welcome to Chechnya as a case study, arguing that the obfuscation created by deepfake technology in this work not only protects the privacy and dignity of the Chechen LGBTQ diaspora but also opens a chance to challenge the totalitarian surveillance system. Last, my research articulates that the potential of A.I. for the weak lies not in how accurate and transparent an algorithm can be but in the extent to which those people can retain their opacity and invisibility with A.I. in the face of viewing entangled with power.

Author bio: Yifeng Wei is an artist, curator, and PhD candidate in Visual Culture at the National College of Art and Design in Ireland. Reflecting on the legacy of cybernetics and systems theory, Wei’s research interests lie in digital colonisation and emancipation, as well as resistance to algorithmic bias and surveillance capitalism. His study relates current technological surveillance to the desire for certainty in cybernetics and systems theory and seeks a possible mode of resistance in the aesthetics of opacity. Wei’s investigation of this aesthetics involves writing an alternative art history focused on anonymous and incognito artists. It also touches upon artistic practices that protect and liberate the oppressed by adopting nontransparent technologies, including the black-box mechanism in artificial intelligence. Revolving around artists who apply such a nebulous approach to resist the power structures looming behind technological society, his recent curatorial practice, “The Cloud of Unknowing”, was shortlisted as a finalist for the Hyundai Blue Prize Art+Tech 2023.

Recorded Presentation | 27 April 2023

#Datasets #DecolonialApproaches #Diaspora #SocialJustice #MarginalizedCommunities
