A user experience methodological framework and dashboard for the measurement and scoring of dynamic, adaptive and intelligent aspects of a software solution with recommendations for enhancement

Student thesis: Doctoral Thesis

Abstract

User Experience (UX) evaluation methods tend to be applied at the end of the software development process, a practice known as summative evaluation. However, few UX evaluation methods focus on the early stages of software development. Evaluating a software solution both during development and after use provides a valuable understanding of its UX. There is therefore a need for a flexible and transparent UX evaluation that can be applied continuously throughout the development lifecycle of a software solution.

This thesis presents a methodological framework encapsulated within an evaluative visual dashboard that assists with the visualisation of results from a UX evaluation. It has been designed for use by a UX team consisting of designers, developers, researchers, architects, engineers, testers and managers. This format allows employees from a variety of areas within a business to come together to measure, score and understand all areas of their software solution’s UX during any stage of its development, and reduces the language barriers that arise within a team environment.

Initial research focused on the impact of sentiment analysis and whether it could be used to assist with the UX enhancement of a software solution. An anonymised dataset was analysed, a range of sentiment was encountered and UX elements were identified. These findings formed the recommendations supplied by the framework, producing Framework Version 2. The Dynamic, Adaptive and Intelligent (DAI) Visualisation Tool was created to encapsulate the framework and provide UX teams with an evaluative visual dashboard that presents their scores in a meaningful way.

Subsequent research validated the tool with three third-party companies and a large European Union (EU) research project by evaluating their software solutions using the design materials they supplied; all feedback received was positive. Feedback obtained from the first two companies informed Version 2 of the DAI Visualisation Tool. Version 2 of the framework and the tool was evaluated through usability testing with an eye tracker, the think-aloud protocol and five participants. Valuable insights were obtained, highlighting issues surrounding the tool’s usability. In addition, a post-evaluation questionnaire and the System Usability Scale (SUS) were completed by each participant and by five employees from industry; all results assisted in resolving usability issues and enhancing the tool’s UX.
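The abstract does not specify which sentiment-analysis technique the thesis used. As a minimal illustrative sketch only, the following scores hypothetical user-feedback comments with NLTK's VADER analyser; the comments and the ±0.05 labelling convention are assumptions for the example, not the thesis's actual dataset or pipeline:

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

analyser = SentimentIntensityAnalyzer()

# Hypothetical anonymised feedback comments about a software solution.
comments = [
    "The dashboard makes our UX scores easy to understand.",
    "Navigation between evaluation stages is confusing.",
    "Loading the results view is painfully slow.",
]

for comment in comments:
    scores = analyser.polarity_scores(comment)
    # 'compound' is a normalised score in [-1, 1]; a common convention
    # treats >= 0.05 as positive and <= -0.05 as negative.
    label = ("positive" if scores["compound"] >= 0.05
             else "negative" if scores["compound"] <= -0.05
             else "neutral")
    print(f"{label:8s} {scores['compound']:+.2f}  {comment}")
```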
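The SUS score mentioned above is computed with a fixed, well-documented formula: each of the ten items is answered on a 1-5 Likert scale, odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch (the example responses are hypothetical):

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert responses -> 0-100 score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    # Odd-numbered items (indices 0, 2, ...) score response - 1;
    # even-numbered items (indices 1, 3, ...) score 5 - response.
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical responses from one participant.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]))  # -> 85.0
```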
Date of Award: May 2023
Original language: English
Sponsors: Department for the Economy
Supervisors: Michaela Black (Supervisor) & Jonathan Wallace (Supervisor)

Keywords

  • Machine learning
  • Education
  • Platform
  • Computing
  • Software development team
  • Software management
  • Industrial pilots
  • Tool versioning
