The Dyadic Interaction Assistant for Individuals with Visual Impairments
Description
This paper presents an overview of the Dyadic Interaction Assistant for Individuals with Visual Impairments, with a focus on its software component. The system is designed to communicate facial information (facial Action Units, facial expressions, and facial features) to an individual with visual impairments during a dyadic interaction between two people sitting across from each other. Composed of (1) a webcam, (2) software, and (3) a haptic device, the system can also be described as a series of input, processing, and output stages, respectively. The processing stage builds on the open-source FaceTracker software and the Computer Expression Recognition Toolbox (CERT). While these two sources provide the facial data, a program developed in the Qt Creator IDE, together with several AppleScripts, adapts the information to a Graphical User Interface (GUI) and outputs the data to a comma-separated values (CSV) file. It is the first software to convey all three types of facial information at once in real time. Future work includes testing and evaluating the quality of the software with human subjects (both sighted and blind/low vision), integrating the haptic device to complete the system, and evaluating the entire system with human subjects (sighted and blind/low vision).
Date Created
The date the item was originally created (prior to any relationship with the ASU Digital Repositories).
2013-05
Agent
- Author (aut): Brzezinski, Chelsea Victoria
- Thesis director: Balasubramanian, Vineeth
- Committee member: McDaniel, Troy
- Committee member: Venkateswara, Hemanth
- Contributor (ctb): Barrett, The Honors College
- Contributor (ctb): Computer Science and Engineering Program