Damaged Merchandise? A Review of Experiments That Compare Usability Evaluation Methods
- 1 September 1998
- journal article
- review article
- Published by Informa UK Limited in Human–Computer Interaction
- Vol. 13 (3), 203-261
- https://doi.org/10.1207/s15327051hci1303_2
Abstract
An interest in the design of interfaces has been a core topic for researchers and practitioners in the field of human-computer interaction (HCI); an interest in the design of experiments has not. To the extent that reliable and valid guidance for the former depends on the results of the latter, it is necessary that researchers and practitioners understand how small features of an experimental design can cast large shadows over the results and conclusions that can be drawn. In this review we examine the design of 5 experiments that compared usability evaluation methods (UEMs). Each has had an important influence on HCI thought and practice. Unfortunately, our examination shows that small problems in the way these experiments were designed and conducted call into serious question what we thought we knew regarding the efficacy of various UEMs. If the influence of these experiments were trivial, then such small problems could be safely ignored. Unfortunately, the outcomes of these experiments have been used to justify advice to practitioners regarding their choice of UEMs. Making such choices based on misleading or erroneous claims can be detrimental: compromising the quality and integrity of the evaluation, incurring unnecessary costs, or undermining the practitioner's credibility within the design team. The experimental method is a potent vehicle that can help inform the choice of a UEM as well as help to address other HCI issues. However, to obtain the desired outcomes, close attention must be paid to experimental design.