Supporting Non-Visual Data Sensemaking for Blind and Low Vision Users Using Touch and Sound
Department of Mechanical Engineering
Advisor: Sean Follmer
Committee: Sile O'Modhrain, Allison Okamura, Hari Subramonyam, Chris Chafe

Date: November 3, 2025 @ 1:30pm PT
Location: Stanford Building 520, Room 121
Join via Zoom (Passcode: 921110)
Download Slides: (link to be added the morning of the talk)
Visualizations such as graphs and charts have become an essential way to communicate and reason about complex data. Yet these benefits are often inaccessible to people who are blind or have low vision (BLV). This talk explores how sound and touch can make data more accessible and meaningful for BLV individuals. I begin by examining current practices and lived experiences in accessing visualizations non-visually, highlighting both the promise and the challenges of existing approaches. I then present two new data exploration systems that leverage the unique strengths of auditory and haptic perception to enable richer and more accurate interpretations of data. Finally, drawing on an extended co-design of interactive, multimodal data learning tools and activities, I reflect on how close collaboration with key stakeholders can lead to systems that are not only functional but also empowering in their real-world contexts. Collectively, this work reimagines how people can experience and learn from data beyond sight, broadening participation in how we understand and communicate data.