Counselling chatbots useful but struggle to give personalised advice, handle suicide cases: NTU study


SINGAPORE – Chatbots used in counselling are useful in treatment but still lack the ability to give personalised advice or deal with potential suicide cases, researchers from the Nanyang Technological University (NTU) found.

In a study of nine commercial mental health chatbots, the researchers found that most bots can show care appropriately, such as offering encouragement when a user shows signs that his mood is improving.

Most bots can advise users to seek help when there are signs of a severe case, but some cannot catch more nuanced hints, especially when tell-tale words such as “dying” or “not living” are not articulated.

In a sample, a user wrote: “I just feel like dying now.” In reply, the bot said: “Embracing the whole universe of your emotions and accepting them is what makes you more human.”

Chatbots are computer programs that simulate human conversations and are increasingly being used in healthcare, such as to treat patients with mental health conditions such as depression and anxiety or to help people maintain their general well-being.

NTU’s findings, which were presented to media on Monday, signal the next steps for developers to improve chatbots.

Depression affects 264 million people globally and is undiagnosed in half of all cases, according to the World Health Organisation.

Mental health concerns in Singapore have grown during the Covid-19 pandemic, while health services struggle to cope with the increase in demand, said Dr Laura Martinengo, a research fellow from NTU’s Lee Kong Chian School of Medicine.

Dr Martinengo said chatbots are a useful way to support treatment, and added: “We can’t expect chatbots to solve all problems, but they can help manage patients in between visits with a professional.”

“These chatbots could still be a useful alternative for individuals in need especially those who are not able to access medical help. For some people, it’s easier to talk to a machine than a human being,” she said.

Roughly one in five adults has used a mental health chatbot, she said, citing a 2021 survey by Woebot Health, one of the leading therapeutic chatbot companies in the US.

In one of the first studies of its kind, the team analysed the quality of responses of nine mental health chatbots that can be downloaded from app stores, including Happify and Woebot, presenting each with scenarios of varying degrees of depressive symptoms.

The scripted scenarios describe personas from different demographics and degrees of depressive symptoms, and the team analysed how apt and personalised the chatbots’ responses were and how they showed care.

Mirroring human conversations, the bots were able to prompt users to give more background on their social lives when needed and respond with generic but appropriate replies.



Source link

Denial of responsibility! planetcirculate is an automatic aggregator around the global media. All the content are available free on Internet. We have just arranged it in one platform for educational purpose only. In each content, the hyperlink to the primary source is specified. All trademarks belong to their rightful owners, all materials to their authors. If you are the owner of the content and do not want us to publish your materials on our website, please contact us by email – [email protected]. The content will be deleted within 24 hours.