Artificial Intelligence vs Mental Health: The Problem with Perfection with Varun Karnik

Published 25 November 2024

Varun Karnik

Medical student | Clinical researcher | Non-profit co-founder

LinkedIn: Varun Karnik

The gap between human knowledge and artificial intelligence (‘AI’) is closing at an astonishing rate. Machine learning (‘ML’) has played an important role, allowing AI to learn and improve over time without human intervention. While AI machines can mimic human reasoning and cognitive function, our unique ‘humanness’ remains distinct. Despite this, there is overwhelming evidence that AI machines can outperform humans in specific tasks, primarily those involving complex cognitive aspects of intelligence. The question then becomes: how far will this superiority go, and which industries will see the greatest change?

The mental health space has seen a rise in the use of AI, as it offers a possible solution to the overwhelming shortage of mental health support. An estimated 20% of the Australian population experiences a mental health disorder in any given year, with the World Health Organisation suggesting that up to 55% of affected individuals in developed countries remain unsupported (1, 2). Conversational bots, or chatbots, are computer programs that can deliver text-based mental health therapy to human users. Decision-tree chatbots are the most commonly used and can generate accurate responses with the help of vast data banks and predefined rules. More advanced AI chatbots are becoming prevalent and can efficiently build rapport by enabling users to control the dialogue, using a combination of machine learning and natural language processing (‘NLP’).
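To make the distinction concrete, the decision-tree approach described above can be sketched as a lookup table of scripted prompts with keyword routing. This is a minimal, illustrative sketch only: the node names, prompts and keyword rules below are invented for the example and are not drawn from any real chatbot product.

```python
# Illustrative decision-tree chatbot: each node holds a scripted prompt
# and a set of keyword "routes" to the next node (the predefined rules).
# All node names and wording here are hypothetical.

TREE = {
    "start": {
        "prompt": "Hi, how are you feeling today?",
        "routes": {"anxious": "anxiety", "sad": "low_mood"},
        "fallback": "clarify",
    },
    "anxiety": {
        "prompt": "That sounds difficult. Would you like to try a breathing exercise?",
        "routes": {},
        "fallback": "clarify",
    },
    "low_mood": {
        "prompt": "I'm sorry to hear that. Can you tell me more about what's been going on?",
        "routes": {},
        "fallback": "clarify",
    },
    "clarify": {
        "prompt": "I'm not sure I understood. Could you rephrase that?",
        "routes": {},
        "fallback": "clarify",
    },
}

def respond(node: str, user_text: str) -> tuple:
    """Match the user's text against the current node's keywords and
    return (next_node, bot_reply). Unmatched input falls back to a
    clarification prompt -- the rigidity the article describes."""
    text = user_text.lower()
    for keyword, next_node in TREE[node]["routes"].items():
        if keyword in text:
            return next_node, TREE[next_node]["prompt"]
    next_node = TREE[node]["fallback"]
    return next_node, TREE[next_node]["prompt"]
```

The contrast with AI-powered chatbots is visible in the fallback branch: anything outside the predefined keyword rules dead-ends in a canned clarification, whereas an ML/NLP system would attempt to interpret the free-form input itself.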


Diagram highlighting the difference between decision tree/keyword chatbots and AI-powered chatbots.

Empirical evidence regarding the effectiveness of chatbots is scant. Studies have demonstrated that chatbots are highly accepted and usable, with the added benefits of anonymity, objectivity, and constant access (3-5). However, chatbots have not yet been shown to improve clinical outcomes, such as symptoms of depression and anxiety.

Along with limited clinical evidence, mental health support chatbots have other issues. First, while conversations between a human therapist and patient are protected under Australian law by the Privacy Act 1988 (Cth), similar data collected by mental health chatbots might not be. One of the better-known chatbots, Woebot, was criticised for its poor handling of sensitive data, allegedly allowing Facebook to access patients' conversational information.

Second, regulation poses another problem, with the Therapeutic Goods Administration (‘TGA’) confirming that software-based medical devices are excluded from regulatory oversight if they are digital implementations of existing clinical practices (6). As a result, most chatbots remain completely unregulated, without supervision from any regulatory body. In the absence of a standard method of verifying the safety and appropriateness of these mental health support tools, the vulnerable population that uses them is potentially placed in great danger.

Finally, the lack of a ‘human’ element introduces prominent safety risks. Chatbots often have limited conversational ability, meaning that responses are shallow or confusing. Alarmingly, some chatbots react inadequately to emergencies, with one study indicating that only 23% of chatbots respond quickly and appropriately to suicidal ideation (7). The most important factor contributing to successful counselling is the therapeutic relationship between therapist and patient, characterised by genuine safety, trust and comfort. A secure relationship with a counsellor increases the likelihood that the patient will disclose relevant information about their experience and can increase the effectiveness of care. No advancement in the functionality of AI will be able to replace a productive, working patient-therapist relationship, and the lack of meaningful interpersonal interaction and direct social support will always limit chatbots' potential.

Nevertheless, chatbots do have an important role to fill, and the conversation around their use needs to change. Chatbots should be seen not as replacements, but as part of the solution for those with limited access to mental health services. For someone on a waiting list, a chatbot may be able to provide early, interim intervention before professional attention becomes available. We are still in the early days of chatbots, and we will undoubtedly see substantial improvements and advancements in their abilities. In the future, advanced AI chatbots may be able to draw on immense data and processing power to give optimal and accurate responses to patients. Still, we must remember the key elements of human therapy that are irreplaceable: genuine empathy, a nuanced understanding of human emotion and meaningful relationships. Even if an AI machine can give perfect responses, humans won’t fully relate to them – perfection simply isn’t human.

References

  1. Anthes E. Mental health: There’s an app for that. Nature. 2016;532(7597):20-3.
  2. Australian Bureau of Statistics. National Study of Mental Health and Wellbeing. ABS; 2020-21. Available from: https://www.abs.gov.au/statistics/health/mental-health/national-study-mental-health-and-wellbeing/2020-21.
  3. Vaidyam AN, Wisniewski H, Halamka JD, Kashavan MS, Torous JB. Chatbots and Conversational Agents in Mental Health: A Review of the Psychiatric Landscape. Can J Psychiatry. 2019;64(7):456-64.
  4. Abd-Alrazaq AA, Rababeh A, Alajlani M, Bewick BM, Househ M. Effectiveness and Safety of Using Chatbots to Improve Mental Health: Systematic Review and Meta-Analysis. J Med Internet Res. 2020;22(7):e16021.
  5. Abd-Alrazaq AA, Alajlani M, Ali N, Denecke K, Bewick BM, Househ M. Perceptions and Opinions of Patients About Mental Health Chatbots: Scoping Review. J Med Internet Res. 2021;23(1):e17828.
  6. Therapeutic Goods Administration. Regulatory changes for software based medical devices. Australian Government Department of Health; 2021.
  7. Singh K, Drouin K, Newmark LP, Lee J, Faxvaag A, Rozenblum R, et al. Many Mobile Health Apps Target High-Need, High-Cost Populations, But Gaps Remain. Health Aff (Millwood). 2016;35(12):2310-8.


Emerging leaders in digital health