Evaluation beyond usability: Validating sustainable HCI research

Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, Elaine M. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

41 Citations (Scopus)

Abstract

The evaluation of research artefacts is an important step in validating research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance for identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current evaluation practice and improving future practice in SHCI; we also believe it holds value for other sub-disciplines in HCI that encounter similar challenges while evaluating their research.

Original language: English
Title of host publication: CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Engage with CHI
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450356206, 9781450356213
DOIs
Publication status: Published - 21 Apr 2018
Externally published: Yes
Event: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018 - Montreal, Canada
Duration: 21 Apr 2018 - 26 Apr 2018

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2018-April

Conference

Conference: 2018 CHI Conference on Human Factors in Computing Systems, CHI 2018
Country/Territory: Canada
City: Montreal
Period: 21/04/18 - 26/04/18

Keywords

  • Evaluation
  • Sustainability
  • Sustainable HCI
  • Validation
