Abstract
Background: It is predicted that medical imaging services will be greatly impacted by AI in the future. Developments in
computer vision have allowed AI to be used for assisted reporting. Studies have investigated radiologists' opinions of
AI for image interpretation (Huisman et al., 2019a/b), but there remains a paucity of information on reporting
radiographers' opinions on this topic.
Method: A survey was developed by radiographers with expertise in AI and promoted via LinkedIn/Twitter and professional
networks to radiographers from all specialities in the UK. A sub-analysis was performed for reporting radiographers
only.
Results: 411 responses were gathered to the full survey (Rainey et al., 2021) with 86 responses from reporting
radiographers included in the data analysis. 10.5% of respondents were using AI tools as part of their reporting role.
59.3% and 57% would not be confident in explaining an AI decision to other healthcare practitioners and to patients and
carers, respectively. 57% felt that an affirmation from AI would increase confidence in their diagnosis. Only 3.5%
would not seek a second opinion following disagreement from AI. A moderate level of trust in AI was reported: mean
score = 5.28 (0 = no trust; 10 = absolute trust). 'Overall performance/accuracy of the system', 'visual explanation
(heatmap/ROI)' and 'indication of the confidence of the system in its diagnosis' were suggested as measures to increase
trust.
Conclusion: AI may impact reporting professionals' confidence in their diagnoses. Respondents are not confident in
explaining an AI decision to key stakeholders. UK radiographers do not yet fully trust AI, and measures that could increase trust are suggested.
| Original language | English |
| --- | --- |
| Pages | 18 |
| Publication status | Published (in print/issue) - 31 Jul 2022 |
| Event | UKIO - Duration: 6 Jun 2022 → … |

Conference

| Conference | UKIO |
| --- | --- |
| Period | 6/06/22 → … |