Abstract
Chatbots are becoming increasingly popular as a human-computer interface. The best practices traditionally applied to User Experience (UX) design cannot easily be applied to chatbots, nor can conventional usability testing techniques guarantee accuracy. WeightMentor is a bespoke self-help motivational tool for weight loss maintenance. This study addresses the following four research questions: How usable is the WeightMentor chatbot, according to conventional usability methods? To what extent do different conventional usability questionnaires correlate when evaluating chatbot usability, and how do they correlate with a tailored chatbot usability survey score? What is the optimum number of users required to identify chatbot usability issues? How many task repetitions are required for first-time chatbot users to reach optimum task performance (i.e. efficiency based on task completion times)? This paper describes the procedure for testing the WeightMentor chatbot, assesses correlation between typical usability testing metrics, and suggests that conventional wisdom on the number of participants required to identify usability issues may not apply to chatbots. The study design was a usability study: WeightMentor was tested using a pre-determined usability testing protocol, evaluating ease of task completion, unique usability errors and participant opinions on the chatbot (collected using usability questionnaires). WeightMentor usability scores were generally high, and correlation between questionnaires was strong. The optimum number of users for identifying chatbot usability errors was 26, which challenges previous research. Chatbot users reached optimum proficiency in tasks after just one repetition. Usability test outcomes confirm what is already known about chatbots - that they are highly usable, due to their simple interface and conversation-driven functionality - but conventional methods for assessing usability and user experience may not be as accurate when applied to chatbots.
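As context for the participant-number result, the "conventional wisdom" on how many users are needed to surface usability problems is usually derived from the binomial problem-discovery model of Nielsen and Landauer. The worked example below is illustrative only: the average per-participant detection probability p ≈ 0.31 comes from that earlier (non-chatbot) work, and the 85% discovery target is an assumption, not a figure reported in this study.

$$P(\text{problem found}) = 1 - (1 - p)^{n}, \qquad 1 - (1 - 0.31)^{5} \approx 0.84$$

Under those assumptions, five participants uncover roughly 85% of problems, which underpins the familiar "five users are enough" guideline. Reaching a comparable discovery level only at n = 26, as reported here, would imply a much smaller per-participant detection probability for chatbot interfaces: solving $1 - (1 - p)^{26} = 0.85$ gives p ≈ 0.07.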
Original language | English |
---|---|
Title of host publication | ECCE 2019 Proceedings of the 31st European Conference on Cognitive Ergonomics |
Subtitle of host publication | "Design for Cognition" |
Publisher | Association for Computing Machinery |
Pages | 207-214 |
Number of pages | 8 |
ISBN (Electronic) | 9781450371667 |
ISBN (Print) | 978-1-4503-7166-7 |
DOIs | |
Publication status | Published (in print/issue) - 10 Sept 2019 |
Event | 31st European Conference on Cognitive Ergonomics: Design for Cognition - Belfast, United Kingdom, 10 Sept 2019 → 13 Sept 2019 (https://www.ulster.ac.uk/conference/european-conference-on-cognitive-ergonomics) |
Publication series
Name | Proceedings of the 31st European Conference on Cognitive Ergonomics |
---|---|
Publisher | Association for Computing Machinery |
Conference
Conference | 31st European Conference on Cognitive Ergonomics |
---|---|
Abbreviated title | ECCE 2019 |
Country/Territory | United Kingdom |
City | Belfast |
Period | 10/09/19 → 13/09/19 |
Internet address | https://www.ulster.ac.uk/conference/european-conference-on-cognitive-ergonomics |
Bibliographical note
Publisher Copyright: © 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM.
Keywords
- Usability Testing
- Chatbots
- Conversational UI
- UX Testing