*3.3. The Chatbots in the Health Domain: Applications, Opportunities, Open Challenges, and Problems*

Chatbots are finding an increasing number of applications in the *health domain*. Diverse keywords [35–55] are associated with the concept of the chatbot in healthcare; among the most frequent are: *patient engagement, clinical support, mental health, health monitoring, patient education, appointment scheduling, symptom checking, chronic disease management, triage, remote monitoring, telemedicine, health coaching and emergency response.*

Chatbots can be used in several fields in the health domain [35–55]: (a) As a tool for answering frequently asked questions [39]; (b) For the collection of data and patient details [39,42,43]; (c) To support patients in finding a doctor or a specific service, managing appointments, and handling the medication dispensing procedure [42]; (d) As an interactive guide for self-assessment and symptom control [39,42]; (e) As a tool to guide an interactive triage, applicable in the case of an emergency as well [36]; (f) In telehealth, digital health applications, and remote monitoring [37,39,40,42,43]; (g) In the learning process, in the construction of scientific knowledge, and in supporting scientific dissemination [35,38]; (h) In mental health applications [37,38]; (i) For physical wellness and health coaching [40].

From a general point of view, these tools have the potential to lighten the load on hospitals and care facilities by decentralizing many activities and allowing them to be carried out remotely, something that, during a situation such as the COVID-19 pandemic, better protects all the actors involved. Patients can become more responsible, self-assess independently, and be invited and supported to take better care of themselves, also with regard to wellness and psychological aspects.

A search was performed on PubMed with the following composite key:

*(chatbot[Title/Abstract]) AND ((health[Title/Abstract]) OR (healthcare[Title/Abstract]) OR (health domain[Title/Abstract])).*
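The composite key can also be submitted programmatically through NCBI's public E-utilities `esearch` endpoint; a minimal sketch (the query term is the one from the text, while the helper name and the 2010–2022 date window are illustrative assumptions):

```python
# Sketch: building an NCBI E-utilities esearch request for the composite
# PubMed key used in the text. Only the URL is constructed here; issuing
# the HTTP request is left to any standard client.
from urllib.parse import urlencode

TERM = ('(chatbot[Title/Abstract]) AND ((health[Title/Abstract]) '
        'OR (healthcare[Title/Abstract]) '
        'OR (health domain[Title/Abstract]))')

def esearch_url(term: str, mindate: str = "2010", maxdate: str = "2022") -> str:
    """Return an esearch URL restricted to the given publication-date window."""
    base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
    params = {
        "db": "pubmed",       # search the PubMed database
        "term": term,         # the composite key from the text
        "datetype": "pdat",   # filter on publication date
        "mindate": mindate,
        "maxdate": maxdate,
        "retmode": "json",
        "retmax": 0,          # only the hit count is needed, not the IDs
    }
    return base + "?" + urlencode(params)

url = esearch_url(TERM)
```

The JSON response's `esearchresult.count` field then gives the number of matching papers for the chosen window.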

The search traced the evolution of scientific dissemination in this area. Since 2010, *370* papers have been published, including 19 systematic reviews.

The search highlights a dramatic growth in the volume of publications over the last three years, coinciding with the outbreak of the pandemic: 340 papers (*91.9%* of the total) were published from 2020 to 2022, and *117* papers (*31.62%* of the total) were published in *2022* alone.
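The two shares reported above follow directly from the counts; a quick arithmetic check:

```python
# Verifying the publication shares reported in the text.
total = 370          # papers published since 2010
since_2020 = 340     # papers published 2020-2022
in_2022 = 117        # papers published in 2022

share_since_2020 = round(since_2020 / total * 100, 1)   # -> 91.9
share_2022 = round(in_2022 / total * 100, 2)            # -> 31.62
```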

We decided to analyze the systematic reviews [37,40,45,49,51,52,54,56–67] to detect the principal patterns of interest and highlight areas where additional research is needed or where current research is insufficient to support clinical decisions to introduce these tools in the clinical routine.

The analysis of the systematic reviews allowed us to detect five areas of interest, which are as follows:


#### 3.3.1. Application of Chatbots in Mental Health

Five studies have investigated the application of chatbots in *mental health* [45,61,65–67]. Lim et al. [61] reviewed the effectiveness of chatbot-delivered psychotherapy in improving depressive symptoms in adults with depression or anxiety. The review highlighted that chatbot-delivered psychotherapy significantly improved depressive symptoms. The preferred features for the design of chatbots include embodiment, a combination of input and output formats, fewer than 10 sessions, problem-solving therapy, offline platforms, and different regions of the United States. The study concluded that chatbot-delivered psychotherapy could be an alternative treatment for depression and anxiety, and further high-quality trials were needed to confirm its effectiveness.

Ruggiano et al. [65] identified current commercially available chatbots that were designed for use by people with dementia and their caregivers, and assessed their quality in terms of features and content. Although the chatbots were generally found to be easy to use, limitations were noted regarding their performance and programmed content for dialog. The authors concluded that evidence-based chatbots were needed to adequately educate and support people with dementia and their caregivers.

Vaidyam et al. [66] reviewed the use of conversational agents (chatbots or voice assistants) in the assessment and treatment of serious mental illnesses, such as depression, anxiety, schizophrenia, and bipolar disorder. The study highlighted positive outcomes for diagnostic quality, therapeutic efficacy, and acceptability. However, certain populations, such as pediatric patients and those with schizophrenia or bipolar disorder, were underrepresented in the research. The authors recommended the standardization of studies to include measures of patient adherence and engagement, therapeutic efficacy, and clinician perspectives.

Gaffney et al. [67] investigated the use of conversational agent interventions in mental health. The interventions were diverse and targeted a range of mental health problems using various therapeutic orientations. All included studies reported reductions in psychological distress post-intervention, and the controlled studies demonstrated significant reductions in psychological distress compared to inactive control groups. However, the authors concluded that a more robust experimental design was required to demonstrate efficacy and efficiency.

Hoermann et al. [45] analyzed the feasibility and effectiveness of one-on-one mental health interventions that used chatbots. The interventions showed significant improvements compared to waitlist conditions, but were not superior to the usual treatment. The study also found substantial innovation in the use of trained volunteers and chatbot technologies. However, further research was needed to determine the feasibility of this mode of intervention in clinical practice.

#### 3.3.2. Application of Chatbots in the Addiction Domain

Three studies dealt with the field of addiction [49,57,62].

Aggarwal et al. [49] evaluated the feasibility, efficacy, and characteristics of AI chatbots for promoting health behavior change. The review found that AI chatbots have shown high efficacy in promoting healthy lifestyles, smoking cessation, treatment or medication adherence, and reduction in substance misuse. However, there were mixed results regarding feasibility, acceptability, and usability. Furthermore, the authors concluded that the reported results needed to be interpreted with caution due to limitations in internal validity, insufficient description of AI techniques, and limited generalizability. Future studies should adopt robust randomized control trials to establish definitive conclusions.

He et al. [57] also investigated conversational agents for smoking cessation. The systematic review and meta-analysis found that all studies reported positive effects on cessation-related outcomes. Meta-analyses of randomized controlled trials showed that conversational agents were more effective in promoting abstinence than control groups. However, the included studies were diverse in design, and evidence of publication bias was identified. The review also highlighted a lack of theoretical foundations and a need for relational communication in future designs. Standardization in the reporting and design of conversational agents was warranted for a more comprehensive evaluation. Overall, this review provided insights into the potential of conversational agents for smoking cessation and the need for further research and development to improve their effectiveness and acceptability.

Ogilvie et al. [62] researched the use of chatbots in the field of addiction, specifically as supportive agents for those with a substance use disorder. The findings suggested that the corpus of research in this field was limited, and more research was needed to confidently report on the usefulness of chatbots in this area. While some papers reported a reduction in substance use among participants, caution was advised, as expert input was needed to safely leverage existing data and avoid potential harm to the intended audience.
