Description
Although many data visualization diagrams can be made accessible to individuals who are blind or visually impaired, they often fail to present information in a way that lets readers intuitively discern patterns in the data. In particular, accessible node graphs tend to rely on speech to describe transitions between nodes. While speech is easy to understand, readers can be overwhelmed by too much of it and may be unable to discern structural patterns in the graphs. Given these limitations, this research seeks better ways to present transitions in node graphs.
This study aims to gain knowledge on how sequence patterns in node graphs can be perceived through speech and nonspeech audio. Users listened to short audio clips describing a sequence of transitions occurring in a node graph. User study results were evaluated based on accuracy and user feedback. Five common techniques were identified through the study, and the results will be used to help design a node graph tool that improves the accessibility of node graph creation and exploration for individuals who are blind or visually impaired.
Details
Title
- Developing a Node Graph Tool: Pattern Recognition Through Sound
Contributors
- Darmawaskita, Nicole (Author)
- McDaniel, Troy (Thesis director)
- Duarte, Bryan (Committee member)
- Computer Science and Engineering Program (Contributor)
- Barrett, The Honors College (Contributor)
Date Created
2019-12