COMMENTARY
Can Artificial Intelligence Chat Bots Help Prevent Suicide?
Arthur L. Caplan, PhD
March 06, 2023
This transcript has been edited for clarity.
Hi. I'm Art Caplan. I'm at the Division of Medical Ethics at New York University Grossman School of Medicine.
Can you use a robot or an algorithm to treat and prevent suicide? What role should these new intelligent programs and algorithms, like ChatGPT, which people are now using to write articles and for many other purposes online, play in mental health?
This issue came up recently in a very important way. Discord, a social media platform with subgroups built around common interests, some of whose members are there because they have problems with mental illness or feel suicidal, offered to work with a platform called Koko. Koko was experimenting with an artificial intelligence chat bot that could interact with people online and, in a way, provide them with support, counseling, or help for their suicidal ideation or mental illness.
I think this is not yet ready for prime time, and there are some really serious ethical issues that we have to pay attention to before we assign, if you will, robots using artificial intelligence to become our mental health care providers or, for that matter, our primary care providers.
Medscape Business of Medicine © 2023
Cite this: Can Artificial Intelligence Chat Bots Help Prevent Suicide? - Medscape - Mar 06, 2023.
Authors and Disclosures
Author
Arthur Caplan, PhD
Director, Division of Medical Ethics, New York University Langone Medical Center, New York, NY
Disclosure: Arthur L. Caplan, PhD, has disclosed the following relevant financial relationships:
Served as a director, officer, partner, employee, advisor, consultant, or trustee for: Johnson & Johnson's Panel for Compassionate Drug Use (unpaid position)
Serves as a contributing author and advisor for: Medscape