The effect of therapeutic and deterrent messages on Internet users attempting to access 'barely legal' pornography.

Jeremy Prichard, Richard Wortley, Paul Watters, Caroline Spiranovic, Joel Scanlan
Published in: Child abuse & neglect (2024)
Online child sexual abuse material (CSAM) is a growing problem. Prevention charities, such as Stop It Now! UK, use online messaging to dissuade users from viewing CSAM and to encourage them to consider anonymous therapeutic interventions. This experiment used a honeypot website that purported to contain barely legal pornography, which we treated as a proxy for CSAM. We examined whether warnings would dissuade males (18-30 years) from visiting the website. Participants (n = 474) who attempted to access the site were randomly allocated to one of four conditions. The control group went straight to the landing page (control; n = 100). The experimental groups encountered different warning messages: deterrence-themed with an image (D3; n = 117); therapeutic-themed (T1; n = 120); and therapeutic-themed with an image (T3; n = 137). We measured the click-through to the site. Three-quarters of the control group attempted to enter the pornography site, compared with 35% to 47% of the experimental groups. All messages were effective: D3 (odds ratio [OR] = 5.02), T1 (OR = 4.06) and T3 (OR = 3.05). Images did not enhance warning effectiveness. We argue that therapeutic and deterrent warnings are useful for CSAM prevention.