Peer-Review Record

Supporting the Demand on Mental Health Services with AI-Based Conversational Large Language Models (LLMs)

BioMedInformatics 2024, 4(1), 8-33; https://doi.org/10.3390/biomedinformatics4010002
by Tin Lai *, Yukun Shi, Zicong Du, Jiajie Wu, Ken Fu, Yichao Dou and Ziqi Wang
Reviewer 1: Anonymous
Reviewer 2:
Submission received: 13 July 2023 / Revised: 22 August 2023 / Accepted: 7 December 2023 / Published: 22 December 2023

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The paper presents an AI-based chatbot for mental health counseling, termed Psy-LLM. While the model demonstrates competence in language processing and generating responses to users' inquiries, it falls significantly short when considered for Mental Health Psychological Services.

One major drawback of the Psy-LLM framework is its lack of compassion and empathy, essential aspects of mental health counseling. The responses provided by the chatbot appear standardized and devoid of human understanding or emotional connection. Mental health support requires a high level of care and compassion, something that an AI-based system like Psy-LLM cannot provide. Patients often seek counseling not only for information but also for emotional support and understanding, which the system fails to offer.

Additionally, Psy-LLM is ill-equipped for long-term care monitoring, a crucial aspect of mental health treatment. Long-term mental health support involves tracking patients' progress over time, understanding their evolving needs, and adapting treatment plans accordingly. The chatbot lacks the ability to maintain continuity of care and personalized monitoring, which are essential for effective mental health support.

Furthermore, the limitations in data collection and model improvement affect the reliability and accuracy of the chatbot's responses. The quality of the training dataset is questionable, leading to responses that may lack logical coherence or provide inaccurate advice. The model's shortcomings, including exposure bias and unidirectional training, hinder its ability to comprehend contextual information, making it unsuitable for nuanced psychological counseling.

While the online consultation service may seem convenient and accessible, it fails to address the complex and sensitive nature of mental health support. The deployment of an emotion recognition system may attempt to identify distressed users, but it cannot replace the genuine empathy and understanding that human counselors provide.

Ethical considerations and user privacy are paramount in mental health counseling, and Psy-LLM falls short in ensuring proper safeguards. Trusting an AI-based system with sensitive psychological information raises concerns about data security and confidentiality.

In conclusion, while Psy-LLM may be fine for language processing tasks, it is not suitable for Mental Health Psychological Services. The standardized and emotionally detached responses, coupled with the lack of long-term care monitoring, make it an inadequate substitute for human counselors. The system's limitations in data quality, model improvement, user experience, and ethical considerations further undermine its effectiveness in providing genuine and comprehensive mental health support. Patients in need of mental health counseling deserve the empathy, care, and personalized attention that only human counselors can provide.

Comments on the Quality of English Language

Not for mental health 

Author Response

Please see the attached PDF for our responses to the review comments. Thank you.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

Dear Mr. Author,

I thoroughly enjoyed reading your paper on scaling up global mental health psychological services with AI-based language models. The study's goal is to create an AI-based psychological support system to assist individuals in improving their well-being. To develop the question-answering language model, the authors compare two models: the PanGu model and the WenZhong model. The current work is interesting and rigorous in nature, but after reading the manuscript, I discovered some conflicting points that could be addressed to improve the work's outcome.

1. The work's abstract states that the goal of the paper is to develop a Psy-LLM framework that can be used as a tool by healthcare professionals; however, the author also refers to this system as a better replacement for human counselling (the conventional mode) and as a way to improve emotional well-being and address anxiety and depression. Since it is unclear whether the system is meant to assist mental health professionals or to provide a psychoeducation platform to patients, the authors should keep the goal consistent throughout.

2. There is no mention of the specificity of the intended users. On page two, it is stated that the system is intended to help with depression and anxiety, but nowhere else in the article is it stated how it will help in these cases or what the specific inclusion criteria for individuals using the platform will be.

3. The authors claim on pages two and four that this platform can replace the traditional approach of counselling with one counsellor, and that it may be a better approach because a counsellor's unfamiliarity to the individual can be a hindrance to effective counselling. The importance of nonverbal cues, as well as the counsellor's rapport-building phases, is overlooked here. The current system may be presented as the best tool in the absence of available psychological help, but the authors' claim of a replacement may be considered an exaggeration.

4. The data set cleanup is excellent and demonstrates the rigour of the work. 

5. The comparison of the two models is also very good because they use different methods to collect data.

6. The current study lacks a user-experience study in which prospective users can interact with the system and provide more accurate answers about their well-being.

7. The example answer shown for the PanGu model, with six lines, is good, and more examples could be added; however, the second and third lines of the response show inference without evidence. In real counselling, such a statement is never made without rapport building and case-study input. As a result, it is critical for the authors to understand the distinction between real-world counselling and AI-based counselling setups.

8. The limitations presented are very good and present a clearer picture of the model, but they differ greatly from the proposal. It is critical for the authors to approach this as an exploratory study and to present the results in raw form before presenting them as inferences from the model.

I appreciate the time and effort put into such research. The study advances current AI research in the mental health field, but it is equally important to be cautious when making claims and comparing it to real-world counselling settings. My recommendation is that the current model be used under the supervision of a trained counsellor as a good psychoeducational tool. A greater number of expert inputs could improve and extend the current model's capacity.

 I wish the authors all the best. 

Author Response

Please see the attached PDF for our responses to your comments. Thank you.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

Comments and Suggestions for Authors

Dear Author,

The current version of the manuscript clearly incorporates the changes suggested by the reviewer.

Best wishes!
