AudioStrokes: Menu Selection through Auditory Feedback

Marilyn Mantei
Department of Computer Science
University of Toronto

Abstract

Graphical user interfaces rely on hand-eye coordination, with users pointing at and selecting items displayed on a screen. Our work looks at how one might design auditory interfaces that allow users to navigate as effectively as they do in a visual world. This transfer of modality poses several problems. One is the loss of two-dimensional spatial information; the second is the loss of association, the visual closeness of related items on the screen. This talk presents a variety of studies we have done on pie menus to overcome some of these difficulties and to provide efficient input mechanisms that work equally well in visual and non-visual environments. We compare speech, music, and unique-sound feedback for guidance, and examine the change in Fitts's law behaviour with the hand-ear guidance provided in our various interface designs.
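
For reference, the Shannon formulation of Fitts's law commonly used in HCI predicts movement time MT from target distance D and width W (the abstract does not specify which formulation the studies fit, so this is given only as background):

    MT = a + b \log_2\left(\frac{D}{W} + 1\right)

where a and b are empirically fitted constants; the studies described above examine how this relationship changes when pointing is guided by the ear rather than the eye.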


