Abstract
Introduction: Chatbots have revolutionized customer support, offering round-the-clock assistance. This research addresses the need to improve chatbot dialogues, given the challenges involved in the quality assurance of chatbot content (utterances). Many chatbots use pre-scripted utterances (conversational dialogue designs) handcrafted by domain experts. For example, a healthcare chatbot may comprise utterances that have been pre-scripted (pre-designed) by healthcare professionals. This deterministic approach avoids serving an AI-generated response that could include potentially harmful advice. This paper presents an intelligent tool for assessing the quality of handcrafted dialogue designs before they are added to a chatbot's knowledge base.
Methods and results: An interactive tool was developed that allows a user to import a handcrafted dialogue dataset and then define compliance rules (e.g. utterances should have a certain sentiment score, readability score, and utterance length, and perhaps use emojis). These rules are used to discover utterances that violate them, allowing conversational designers to review and, where necessary, edit those utterances to provide a better user experience while quality-assuring the dialogue design. The tool analyzes and visualizes dialogue characteristics so that users can identify flagged issues and edit the flagged utterances, promoting better dialogue quality.
Conclusion: This research contributes to conversational design and chatbot development, offering an innovative tool that enhances chatbot quality and ensures consistent, engaging customer experiences.
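The kind of rule-based compliance checking described in the abstract can be sketched in a few lines of code. The sketch below is illustrative only: the rule names, thresholds, and library choices (NLTK's VADER for sentiment scoring, textstat for readability, a simple regex for emoji detection) are assumptions for this example, not the authors' actual implementation.

```python
"""Minimal sketch of rule-based utterance compliance checking.

All rule names and thresholds here are illustrative assumptions;
the paper's tool is not reproduced.
"""
import re
from dataclasses import dataclass

import textstat
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download('vader_lexicon')

# Rough heuristic covering common emoji code-point ranges.
EMOJI_RE = re.compile("[\U0001F300-\U0001FAFF\u2600-\u27BF]")


@dataclass
class ComplianceRules:
    min_sentiment: float = 0.0     # VADER compound score, range [-1, 1]
    min_readability: float = 60.0  # Flesch reading ease (higher = easier)
    max_length: int = 200          # maximum utterance length in characters
    allow_emoji: bool = True


def check_utterance(text: str, rules: ComplianceRules,
                    sia: SentimentIntensityAnalyzer) -> list[str]:
    """Return the names of the rules the utterance violates."""
    flags = []
    if sia.polarity_scores(text)["compound"] < rules.min_sentiment:
        flags.append("sentiment")
    if textstat.flesch_reading_ease(text) < rules.min_readability:
        flags.append("readability")
    if len(text) > rules.max_length:
        flags.append("length")
    if not rules.allow_emoji and EMOJI_RE.search(text):
        flags.append("emoji")
    return flags


if __name__ == "__main__":
    sia = SentimentIntensityAnalyzer()
    rules = ComplianceRules(allow_emoji=False)
    utterances = [
        "Hi! How can I help you today? 😊",
        "Unfortunately your claim was rejected.",
    ]
    # Flag non-compliant utterances so a designer can review and edit them.
    for u in utterances:
        violations = check_utterance(u, rules, sia)
        if violations:
            print(f"FLAGGED {violations}: {u}")
```

Keeping the rules in a plain data structure, as above, mirrors the workflow the paper describes: designers adjust thresholds per dialogue dataset and re-run the checks, reviewing whatever is flagged rather than having utterances rewritten automatically.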
Original language | English |
---|---|
Title of host publication | Proceedings of 37th International BCS Human-Computer Interaction Conference (BCS HCI 24) |
Publisher | BCS Learning & Development Ltd |
Pages | 1-6 |
Number of pages | 6 |
Publication status | Published (in print/issue) - 17 Jul 2024 |
Event | 37th International BCS Human-Computer Interaction Conference, Preston, United Kingdom. Duration: 15 Jul 2024 → 17 Jul 2024. Conference number: 37 |
Publication series
Name | eWIC |
---|---|
Publisher | BCS |
ISSN (Electronic) | 1477-9358 |
Conference
Conference | 37th International BCS Human-Computer Interaction Conference |
---|---|
Abbreviated title | BHCI-2024 |
Country/Territory | United Kingdom |
City | Preston |
Period | 15/07/24 → 17/07/24 |
Keywords
- Chatbot development
- UX
- User experience
- Conversational design
- Natural language processing
- Sentiment analysis
- Complexity assessment
- Emoji validation
- Utterance length