Using nonspeech sounds to provide navigation cues
- 1 September 1998
- journal article
- Published by Association for Computing Machinery (ACM) in ACM Transactions on Computer-Human Interaction
- Vol. 5 (3), 224-259
- https://doi.org/10.1145/292834.292839
Abstract
This article describes three experiments that investigate the possibility of using structured nonspeech audio messages called earcons to provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes and 4 levels was created with an earcon for each node, and rules were defined for the creation of hierarchical earcons at each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results of the first experiment showed that participants could identify their location with 81.5% accuracy, indicating that earcons are a powerful method of communicating hierarchy information. One proposed use for such navigation cues is in telephone-based interfaces (TBIs), where navigation is a problem. The first experiment did not address the particular problems of earcons in TBIs, such as “does the lower quality of sound over the telephone lower recall rates?”, “can users remember earcons over a period of time?”, and “what effect does training type have on recall?” A second experiment showed that sound quality did lower the recall of earcons. However, a redesign of the earcons overcame this problem, with 73% recalled correctly. Participants could still recall earcons at this level after a week had passed. Training type also affected recall: with personal training participants recalled 73% of the earcons, but with purely textual training results were significantly lower. These results show that earcons can provide good navigation cues for TBIs. The final experiment used compound, rather than hierarchical, earcons to represent the hierarchy from the first experiment. Results showed that with sounds constructed in this way participants could recall 97% of the earcons. These experiments have developed our general understanding of earcons: a hierarchy three times larger than any previously created was tested, and this was also the first test of the recall of earcons over time.
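The abstract describes hierarchical earcons in which each node's sound is derived from its parent's by rule. As a rough illustration only (the specific motif parameters and per-level rules below are assumptions, not the rules defined in the article), a node's earcon can be modeled as the parent's sequence of motifs plus one new motif distinguishing it from its siblings:

```python
from dataclasses import dataclass, field
from typing import Optional, List

# Hypothetical sketch: a node's earcon inherits its parent's motifs and
# appends one motif of its own. Which musical parameter (timbre, rhythm,
# pitch) varies at each level is an illustrative assumption.

@dataclass
class Motif:
    timbre: str
    rhythm: str
    pitch: str

@dataclass
class Node:
    name: str
    level: int
    parent: Optional["Node"] = None
    children: List["Node"] = field(default_factory=list)

    def earcon(self) -> List[Motif]:
        inherited = self.parent.earcon() if self.parent else []
        # One new motif per level of the hierarchy.
        return inherited + [Motif(timbre=f"timbre-level-{self.level}",
                                  rhythm=f"rhythm-{self.name}",
                                  pitch=f"pitch-{self.name}")]

# A tiny slice of a 4-level menu hierarchy (the article used 27 nodes).
root = Node("root", level=1)
menu = Node("A", level=2, parent=root)
root.children.append(menu)
item = Node("A1", level=3, parent=menu)
menu.children.append(item)

print(item.earcon())  # three motifs: root's, A's, then A1's
```

Listening to a node's full motif sequence thus reveals the path from the root, which is what lets participants infer their location in the menu from sound alone.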