Abstract
Large numbers of determined predators trawl the online platforms that
children use, hunting for particular online behaviours or particular responses
to the many approaches they make to children.
Technology Assisted Child Sexual Abuse (TACSA) victims are often already victims of
other types of abuse. Known risk factors include being isolated, having depressive
symptoms, low self-esteem, a minority sexual identity, being neglected, or being
known to child protection services. Problematic peer influence, family dysfunction
or a poor connection with parents and carers can exacerbate the risk. However, it is
clear that all children and young people are at risk, and the evidence that educative
interventions with children have a significant impact is limited.
Children and young people rarely disclose TACSA because of shame, fear of abusive
images being distributed, or a lack of understanding of the abuse. Disclosure of
TACSA is more likely when children have trust in significant adults but disclosure
rates are very low regardless; victims are more likely to disclose to peers.
The primary recommendation from this review is to hold online platforms to account
for an environment in which abusers can easily engage with children using fictitious
online profiles, safe in the knowledge that their grooming activities are protected by
end-to-end data encryption. The full implementation, regulation and enforcement of
the Online Safety Act 2023 would be a welcome step forward.
| Original language | English |
|---|---|
| Number of pages | 101 |
| Publication status | Accepted/In press - 30 Jan 2026 |
Funding
The authors would like to thank the Safeguarding Board for Northern Ireland (SBNI) for funding this important and timely research.
Title: Understanding Risk, Barriers and Facilitators to Reporting Technology Assisted Child Sexual Abuse