Evaluation beyond usability: Validating sustainable HCI research

Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, Elaine M. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-reviewed

46 Citations (Scopus)

Abstract

The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other sub-disciplines in HCI that encounter similar challenges while evaluating their research.

Original language: English
Title: CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
Subtitle: Engage with CHI
Publisher: Association for Computing Machinery
ISBN (electronic): 9781450356206, 9781450356213
Digital Object Identifiers (DOIs)
Status: Published - 21 Apr 2018
Published externally: Yes
Event: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018 - Montreal, Canada
Duration: 21 Apr 2018 - 26 Apr 2018

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2018-April

Conference

Conference: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018
Country/Territory: Canada
City: Montreal
Period: 21/04/18 - 26/04/18
