Taking Off with Ease or Face-Off with Justice? Mapping Digital Citizenship and ‘Ways of Seeing’ the Indian Biometric State

Abstract: In November 2022, NITI Aayog (the successor to India’s Planning Commission) released a discussion paper titled ‘Responsible AI for All’ for public comment, introducing Facial Recognition Technology (FRT) at domestic airports. While the document awaited feedback, earlier this month the Indian civil aviation ministry launched DigiYatra (short for ‘Digital Yatra’, or digital travel), a biometric app for domestic passengers travelling from New Delhi airport. The official website promotes FRT as the ‘future of air travel’ and encourages users to participate voluntarily in the policy. With the face as boarding pass, people, data and technologies thereby intersect with one another, promising ‘a simple and easy’ travel experience. However, the use of FRT by Indian law enforcement has drawn criticism from human rights activists for targeting individuals from marginalized backgrounds. This notwithstanding the fact that the Indian capital, with 1,862.6 CCTV cameras per square mile, is the site of one of the largest government surveillance projects in the world. There are additional concerns regarding the accuracy of the algorithms, and, given that India has no data protection law, the official recommendations making a case for ‘Responsible AI’ (RAI) have further raised legal experts’ concerns about privacy violations. Situated within this context, my paper reviews the policy formulations of DigiYatra that describe the futuristic imagery of FRT as desirable AI. It ‘looks’ into how the camera’s gaze categorically renders the body of the citizen-subject as a visual portrait in a digital database, while this visual persuasion furnishes a process of governance that the state deems just, efficient and good. The rise of the AI-enabled ethnographic state thus reveals how the ocular-centric capital of FRT pivots on the promise of an efficient digital future, imaging ‘ways to see’ and positing itself as a currency of modern state rule and ‘law by other means’.

Panelist bio: Madhavi Shukla is a doctoral candidate at the Centre for the Study of Law and Governance, Jawaharlal Nehru University, New Delhi. Her research focuses on law and visuality, law and society, and cultures of technology. She is also part of a study group on ‘AI and Facial Recognition Technology’ anchored at the Centre for the Study of Law and Governance, which has shared its recommendations on the NITI Aayog discussion paper titled ‘Responsible AI for All: Adopting the Framework – A use case approach on Facial Recognition Technology.’ As a legal feminist, she is passionate about feminist jurisprudence and imagines possibilities for equal futures for everyone. When she is not dreaming of equality, her hobbies include tending to her plants and playing with cats.

Recorded Presentation | 26 April 2023

#FacialRecognition #Biometrics #DigitalState #Citizens #India
