Direct Integration: Training Software Developers to Conduct Usability Evaluations

Publication: Contribution to journal › Conference article in journal › Research › peer review

Abstract

Many improvements to the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both approaches involve a complete division of work between developers and evaluators, which is an undesirable complexity for many software development projects. This paper takes a different approach by exploring to what extent software developers and designers can be trained to carry out their own usability evaluations. The paper is based on an empirical study in which 36 teams, with a total of 234 first-year university students in software development and design programmes, were trained in a simple approach to user-based website usability testing, taught in a 40-hour course. This approach supported them in planning, conducting, and interpreting the results of a usability evaluation of an interactive website. They gained good competence in conducting the evaluation, defining task assignments, and producing a usability report, while they were less successful in acquiring skills for identifying and describing usability problems.
Original language: English
Journal: CEUR Workshop Proceedings
Pages (from-to): 74-81
ISSN: 1613-0073
Status: Published - 2008
Event: 1st Workshop on the Interplay between Usability Evaluation and Software Development (I-USED 2008) - Pisa, Italy
Duration: 24 Sep 2008 → 24 Sep 2008
Conference number: 1



Bibliographic note

Volume: 407

Cite this

@inproceedings{c892bf80e93211ddb0a4000ea68e967b,
title = "Direct Integration: Training Software Developers to Conduct Usability Evaluations",
abstract = "Many improvements to the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both approaches involve a complete division of work between developers and evaluators, which is an undesirable complexity for many software development projects. This paper takes a different approach by exploring to what extent software developers and designers can be trained to carry out their own usability evaluations. The paper is based on an empirical study in which 36 teams, with a total of 234 first-year university students in software development and design programmes, were trained in a simple approach to user-based website usability testing, taught in a 40-hour course. This approach supported them in planning, conducting, and interpreting the results of a usability evaluation of an interactive website. They gained good competence in conducting the evaluation, defining task assignments, and producing a usability report, while they were less successful in acquiring skills for identifying and describing usability problems.",
author = "Skov, {Mikael B.} and Jan Stage",
note = "Volume: 407",
year = "2008",
language = "English",
pages = "74--81",
journal = "CEUR Workshop Proceedings",
issn = "1613-0073",
publisher = "CEUR Workshop Proceedings",
}

Direct Integration: Training Software Developers to Conduct Usability Evaluations. / Skov, Mikael B.; Stage, Jan.

In: CEUR Workshop Proceedings, 2008, pp. 74-81.

