Can Your Phone Be Your Therapist? Young People's Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support.

Kira Kretzschmar, Holly Tyroll, Gabriela Pavarini, Arianna Manzini, Ilina Singh
Published in: Biomedical Informatics Insights (2019)
Over the last decade, there has been an explosion of digital interventions that aim to either supplement or replace face-to-face mental health services. More recently, a number of automated conversational agents have also been made available, which respond to users in ways that mirror a real-life interaction. What are the social and ethical concerns that arise from these advances? In this article, we discuss, from a young person's perspective, the strengths and limitations of using chatbots in mental health support. We also outline what we consider to be minimum ethical standards for these platforms, including issues surrounding privacy and confidentiality, efficacy, and safety, and review three existing platforms (Woebot, Joy, and Wysa) according to our proposed framework. It is our hope that this article will stimulate ethical debate among app developers, practitioners, young people, and other stakeholders, and inspire ethically responsible practice in digital mental health.