Awareness and views of healthcare professionals towards mental health chatbots

Research output: Contribution to conference › Abstract › peer-review


Digital technologies such as chatbots are becoming increasingly popular in mental health. Trust is an important consideration for the adoption of technology in the mental health domain, as health professionals must be confident in the tools they recommend to clients. Thus, the aim of this study was to explore the views of healthcare professionals towards the use of mental health chatbots.

Mental healthcare professionals (n=218) were surveyed about their awareness, attitudes, and practices relating to chatbots. Answers to survey questions were compared pre-COVID (n=131, responses collected before 12th March 2020) and active/post-COVID (n=87, responses collected from 12th March 2020 onwards).

Healthcare professionals were asked if they believed their clients used mental health chatbots. One quarter said yes (23%), one third said no (32%), and the remainder were unsure. Answers to this question were similar throughout the duration of the survey, with no statistically significant difference pre-COVID compared to active/post-COVID (W=3553, p=0.85). Healthcare professionals had more personal experience with chatbots for physical health (18%) than for mental health (9%). Pre-COVID, 15% of healthcare professionals had used chatbots for physical health and 6% had used mental health chatbots; post-COVID these figures rose to 22% and 14% respectively, although the increases were not statistically significant for either physical health (χ²=1.04, df=1, p=0.31) or mental health (χ²=2.21, df=1, p=0.13). Overall, the majority (74%) of professionals believed that chatbots are very or somewhat important in mental health care, while the remaining 26% felt they are not very or not at all important. Comparing pre-COVID to active/post-COVID, the percentage of those who thought chatbots were very or somewhat important rose from 73% to 77%; however, this change was not statistically significant (W=3470, p=0.66).

Healthcare professionals were asked how often they recommend apps to a client. Pre-COVID, the responses were: 3% very frequently, 14% frequently, 47% sometimes, 36% never. In comparison, active/post-COVID responses were statistically significantly different (W=4693, p=0.039), favouring more positive responses: 6% very frequently, 15% frequently, 60% sometimes, 19% never.

When asked how likely they would be to prescribe chatbots to clients in the next 5 years, one fifth of participants said they would be very likely, just over half said they would be somewhat likely, and a quarter thought they would be somewhat or very unlikely. The likelihood of professionals prescribing these technologies did not change over the duration of the study.

Anxiety, depression, and stress were the top three disorders that more than 70% of healthcare professionals thought chatbots had some or significant potential to target, both pre-COVID and active/post-COVID. Conversely, healthcare professionals anticipated more problems with chatbots targeting dementia and schizophrenia.

Overall, perceptions were generally positive, and support was greatest for chatbots that target stress, anxiety, and/or depression. Attitudes towards the use of mental health chatbots are shifting; however, confidence in prescribing these technologies remains limited. It is important to target chatbots at the use cases professionals endorse. Future work should focus on building the evidence base in support of mental health chatbots.
Original language: English
Publication status: Published (in print/issue) - 4 Jul 2023
Event: International Digital Mental Health & Wellbeing Conference - Ulster University, Belfast, United Kingdom
Duration: 21 Jun 2023 - 23 Jun 2023
Conference number: 1




  • chatbots
  • mental health

