Shared Privacy Concerns of the Visually Impaired and Sighted Bystanders with Camera-Based Assistive Technologies
- Taslima Akter,
- Tousif Ahmed,
- Apu Kapadia,
- Swami Manohar
ACM Transactions on Accessible Computing (TACCESS), Vol. 15, pp. 1–33
Camera-based assistive technologies can provide people with visual impairments (PVIs) visually derived information about people in their vicinity. Furthermore, the advent of smart glasses offers the possibility of analyzing visual information not only in front of wearers but also behind them, through an extended field of view. Although such ‘visually available’ information can enhance one’s social interactions, the privacy and ethical implications of automated judgments about bystanders, especially from the perspective of PVIs, remain underexplored. To study the concerns of both bystanders and PVIs with such technologies, we conducted two online surveys with visually impaired participants as wearers (N = 128) and sighted participants as bystanders (N = 136). Although PVIs found some types of information to be improper or impolite to obtain (such as someone’s weight), our overarching finding is a shared ethical concern between PVIs and bystanders related to the fallibility of AI, whereby bystanders can be algorithmically misrepresented by these devices. Such mischaracterizations range from occasional unexpected algorithmic errors (e.g., errors in facial recognition) to the questionable use of AI for determining subjective social constructs (such as gender). Based on our findings, we discuss design implications and directions for future work on camera-based assistive technologies that mitigate the ethical concerns of PVIs and bystanders.