Distributed usability evaluation: enabling large-scale usability evaluation with user-controlled instrumentation

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

We present DUE (Distributed Usability Evaluation), a technique for collecting and evaluating usability data. The DUE infrastructure involves a client-server network. A client-based tool resides on the workstation of each user, providing screen video recording, microphone input of voice commentary, and a window for a severity rating. The idea is for the user to work naturalistically, clicking a button when a usability problem or point of uncertainty is encountered, describing it verbally while illustrating it on screen, and rating its severity. These incidents are accumulated on a server, giving access to an evaluator (a usability expert) and to product developers or managers who want to review and analyse the incidents. DUE supports evaluation in the development stages from running prototypes onwards. A case study of the use of DUE in a corporate environment is presented. The study indicates that the DUE technique is effective in terms of low bias, high efficiency, and clear communication of usability issues among users, evaluators and developers. Further, DUE supports long-term evaluations, making empirical studies of learnability possible.
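
The paper itself contains no code; purely as an illustrative sketch of the client-server flow the abstract describes, the snippet below shows what a DUE-style incident record and its submission to the accumulation server might look like. All names (UsabilityIncident, submit_incident, the /incidents endpoint) are hypothetical assumptions, not the authors' actual design.

```python
# Hypothetical sketch of a DUE-style client report. Field names and the
# server endpoint are illustrative assumptions, not taken from the paper.
import json
import urllib.request
from dataclasses import dataclass, asdict


@dataclass
class UsabilityIncident:
    user_id: str           # anonymised identifier of the reporting user
    timestamp: str         # ISO 8601 time the user clicked the report button
    severity: int          # user-assigned severity rating, e.g. 1 (minor) to 5 (critical)
    commentary_audio: str  # path/URL of the recorded voice commentary clip
    screen_video: str      # path/URL of the screen recording around the incident
    description: str       # optional free-text note


def submit_incident(server_url: str, incident: UsabilityIncident) -> int:
    """POST one incident to the accumulation server; returns the HTTP status code."""
    payload = json.dumps(asdict(incident)).encode("utf-8")
    request = urllib.request.Request(
        f"{server_url}/incidents",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    # Build one incident locally and show the JSON payload that
    # submit_incident() would send to the (hypothetical) server.
    incident = UsabilityIncident(
        user_id="user-042",
        timestamp="2010-10-16T10:23:00Z",
        severity=3,
        commentary_audio="clips/incident-17.ogg",
        screen_video="clips/incident-17.webm",
        description="Save dialog hides the field I was editing.",
    )
    print(json.dumps(asdict(incident), indent=2))
```

In such a design, the evaluator and the developers would query the server-side store of these records to review the screen video and voice commentary for each reported incident.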
Original language: English
Title: Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries
Number of pages: 10
Publisher: Association for Computing Machinery
Publication date: 2010
Pages: 118-127
ISBN (Print): 978-1-60558-934-3
DOI
Status: Published - 2010
Event: 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries - Reykjavik, Iceland
Duration: 16 Oct 2010 - 20 Oct 2010
Conference number: 6

Conference

Conference: 6th Nordic Conference on Human-Computer Interaction
Number: 6
Country: Iceland
City: Reykjavik
Period: 16/10/2010 - 20/10/2010

    Research areas

  • Faculty of Science - automation, beta test, case study, distributed, evaluator effect, instrumentation, learnability, remote, screen video, software industry, think-aloud, usability evaluation, voice commentary
