Can Artificial Intelligence Chat Bots Help Prevent Suicide?

COMMENTARY


Arthur L. Caplan, PhD


March 06, 2023


This transcript has been edited for clarity.

Hi. I'm Art Caplan. I'm at the Division of Medical Ethics at New York University Grossman School of Medicine.

Can you use a robot or an algorithm to treat and prevent suicide? What role should these new intelligent programs and algorithms, such as ChatGPT, which people have begun using to write articles and for many other purposes online, play in mental health?

This issue came up recently in a very important way. Discord, a social media platform organized into subgroups with common interests — some of whose members are there because they struggle with mental illness or feel suicidal — offered to work with a platform called Coco, which was experimenting with an artificial intelligence chat bot that could interact with people online and, in a way, provide them with support, counseling, or help for their suicidal ideation or mental illness problems.

I think this is not yet ready for prime time. There are some really serious ethical issues that we have to pay attention to before we assign, if you will, robots using artificial intelligence to become our mental health care providers — or, for that matter, our primary care providers.
