Research Repository

Applying user testing data to UEM performance metrics.

Chattratichart, Jarinee; Brodie, Jaqueline

Authors

Jarinee Chattratichart

Jaqueline Brodie

Abstract

The lack of standard assessment criteria for reliably comparing usability evaluation methods (UEMs) is an important gap in HCI knowledge. Recently, metrics for assessing the thoroughness, validity, and effectiveness of UEMs, based on user data, have been proposed to bridge this gap. This paper reports our findings from applying these proposed metrics in a study that compared heuristic evaluation (HE) with HE-Plus (an extended version of HE). Our experiment showed better overlap among the HE-Plus evaluators than among the HE evaluators, indicating greater reliability of the method. When evaluation data from usability testing of the same website was used to calculate the UEM performance metrics, HE-Plus outperformed HE on all assessment criteria, with improvements of 17%, 39%, and 67% in thoroughness, validity, and effectiveness, respectively. The paper concludes with a discussion of the limitations of the effectiveness of the UEM from which the real users' data was obtained.
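
The metrics named in the abstract are commonly defined in the UEM-comparison literature (following Hartson, Andre, and Williges) in terms of the overlap between the problems a method reports and the "real" problems confirmed by user testing. The sketch below illustrates those standard definitions; the function name, variable names, and the example numbers are illustrative only and are not taken from the paper.

    # A minimal sketch of the UEM performance metrics, assuming the
    # standard definitions (Hartson, Andre & Williges): names and
    # example data are hypothetical, not from the paper.

    def uem_metrics(problems_found: set, real_problems: set) -> dict:
        """Compute thoroughness, validity, and effectiveness of a UEM.

        problems_found: problems reported by the method (e.g. HE or HE-Plus)
        real_problems:  problems confirmed by user testing data
        """
        true_hits = problems_found & real_problems           # real problems the method found
        thoroughness = len(true_hits) / len(real_problems)   # share of real problems found
        validity = len(true_hits) / len(problems_found)      # share of reported problems that are real
        effectiveness = thoroughness * validity              # combined figure of merit
        return {"thoroughness": thoroughness,
                "validity": validity,
                "effectiveness": effectiveness}

    # Hypothetical example: a method reports 20 problems, 12 of which
    # match the 30 problems observed in user testing.
    found = {f"p{i}" for i in range(20)}        # p0..p19
    real = {f"p{i}" for i in range(8, 38)}      # p8..p37; overlap p8..p19 = 12 hits
    print(uem_metrics(found, real))
    # {'thoroughness': 0.4, 'validity': 0.6, 'effectiveness': 0.24}

Under these definitions, the abstract's comparison amounts to computing the three figures once per method from the same user-testing baseline and comparing them.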

Citation

Chattratichart, J., & Brodie, J. (2004). Applying user testing data to UEM performance metrics. In CHI EA '04: CHI '04 Extended Abstracts on Human Factors in Computing Systems (pp. 1119-1122). doi:10.1145/985921.986003

Conference Name Extended Abstracts of the 2004 Conference on Human Factors in Computing Systems (CHI '04)
Start Date Apr 24, 2004
End Date Apr 29, 2004
Publication Date Apr 24, 2004
Deposit Date Apr 11, 2018
Publisher Association for Computing Machinery (ACM)
Pages 1119-1122
Book Title CHI EA '04: CHI '04 Extended Abstracts on Human Factors in Computing Systems
Chapter Number 1
ISBN 1581137036
DOI https://doi.org/10.1145/985921.986003
Keywords Heuristic evaluation
Public URL http://researchrepository.napier.ac.uk/Output/1150128