We study the lived experiences of users to understand their technology needs, identify practical barriers, and translate those findings into concrete design directions for developers and practitioners.
Selected Publications
Beyond Visual Perception: Insights from Smartphone Interaction of Visually Impaired Users with Large Multimodal Models. CHI 2025.
An empirical analysis of how blind users interact with large multimodal models on smartphones.
Uncovering Human Traits in Determining Real and Spoofed Audio: Insights from Blind and Sighted Individuals. CHI 2024.
A comparative study of the cues blind and sighted listeners use to detect spoofed audio. The paper challenges conventional wisdom in AI and security research by proposing perceptible, rather than imperceptible, watermarks for detecting AI-generated content. Among other findings, we show that a simple change, removing breathing sounds from AI-generated audio, can serve as a reliable perceptible watermark for deepfake audio.
Are Two Heads Better than One? Investigating Remote Sighted Assistance with Paired Volunteers. DIS 2023.
A study of paired volunteers in remote sighted assistance and its effects on collaboration quality.
Understanding Screen Readers' Plugins. ASSETS 2021.
An empirical look at why and how blind users adopt screen-reader plugins.