NORFOLK, Va. (WAVY) — Mental health help is right at your fingertips, with many people now turning to chatbots when traditional therapy is either unavailable or unwanted. But can talking with a chatbot cause more harm than good?
“It could absolutely give people tools, but the computer also needs to know when it’s met its match, and I don’t know if it can do that,” said EVMS-ODU Psychiatry Department Chair Dr. David Spiegel.
Spiegel sees the advantages of artificial intelligence, especially as America navigates a mental health epidemic amid a shortage of mental health care providers. Chatbots are available 24/7, at no cost and with no judgment. They also, however, have no empathy.
“I think the give-and-take of the conversation you get from the computer is not the same as give-and-take humans have with each other,” Spiegel said.
A machine may be helpful for navigating relationship issues or social anxiety, Spiegel said, and can offer mindfulness and meditation techniques.
He cautions, though, that a computer cannot prescribe medication or observe a patient directly. A chatbot operates only on the information a person puts into it, and one symptom by itself, Spiegel warns, means very little. He pores over pages and does detective work to deliver a diagnosis.
“If somebody gave me a diagnosis of cancer,” Spiegel said, “I’d want to ask them, ‘What makes you think I have cancer?’”
Researchers are working on ways to improve AI so that future algorithms may even predict psychiatric episodes before they happen.
For now, Spiegel advises people to proceed with caution.
“I think, in a crisis situation,” Spiegel said, “there is no substitute for the real thing.”
If you are experiencing a mental health emergency, call or text 988, the national mental health crisis hotline, or contact your doctor.