Review

Chatbots in Cancer Applications, Advantages and Disadvantages: All that Glitters Is Not Gold

by Georgios Goumas 1,*, Theodoros I. Dardavesis 2, Konstantinos Syrigos 3, Nikolaos Syrigos 3,4 and Effie Simou 1

1 Department of Public Health Policy, School of Public Health, University of West Attica, 115 21 Athens, Greece
2 Laboratory of Hygiene, Social & Preventive Medicine and Medical Statistics, School of Medicine, Faculty of Health Sciences, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
3 Oncology Unit, 3rd Department of Medicine, “Sotiria” Hospital for Diseases of the Chest, National and Kapodistrian University of Athens, 115 27 Athens, Greece
4 Dana-Farber Cancer Institute, Boston, MA 02215, USA
* Author to whom correspondence should be addressed.
J. Pers. Med. 2024, 14(8), 877; https://doi.org/10.3390/jpm14080877
Submission received: 3 July 2024 / Revised: 12 August 2024 / Accepted: 14 August 2024 / Published: 19 August 2024
(This article belongs to the Section Personalized Therapy and Drug Delivery)

Abstract

The emergence of digitalization and artificial intelligence has had a profound impact on society, especially in the field of medicine. Digital health is now a reality, with an increasing number of people using chatbots for prognostic or diagnostic purposes, therapeutic planning, and monitoring, as well as for nutritional and mental health support. Initially designed for various purposes, chatbots have demonstrated significant advantages in the medical field, as indicated by multiple sources. However, there are conflicting views in the current literature, with some sources highlighting their drawbacks and limitations, particularly in their use in oncology. This state-of-the-art review article seeks to present both the benefits and the drawbacks of chatbots in the context of medicine and cancer, while also addressing the challenges in their implementation, offering expert insights on the subject.

1. Introduction

In the 21st century, healthcare has undergone significant changes due to the increasing number of patients with chronic medical conditions, the breakdown of the traditional hierarchy in medicine, and easier access to new technologies, medical knowledge, and peer support online. This shift in culture, known as “digital health”, is becoming more evident. The roles of both patients and healthcare providers have also changed. Patients are now taking on a more proactive, empowered role and want to be active participants in their care. These “empowered patients”, also known as electronic patients (e-patients), are knowledgeable about managing their health or diseases, have access to information and technologies, and use electronic gadgets to collect data. Similarly, doctors’ roles are evolving into that of the “empowered physician or electronic physician (e-physician) who helps patients navigate the maze of digital information rather than acting as the gatekeepers to the ivory tower of medicine” [1].
Digital technologies, such as big data, the Internet of Things, virtual and augmented reality, smartphone and other apps, artificial intelligence (AI), social media, and chatbots, combine computing platforms, connectivity, software, and sensors for healthcare and related uses. They are being applied to the medical and health fields at an ever-increasing pace and have breathed new vitality into the evolution of digital health [2]. In particular, digital health has undergone an astounding transformation since the coronavirus disease 2019 (COVID-19) pandemic [1,3]. The most significant feature of this transition is that digital health technologies place the patient at the center of care, enabling people to receive medical care or a diagnosis wherever they are and to take a more active role in their own care, thanks to health monitors and mobile diagnostic gadgets that measure blood pressure, electrocardiography, fitness activity, and sleep quality, giving them access to data that were previously confined to the doctor’s files [1]. The ability to hold natural language interactions with users through a variety of communication channels has made chatbots an indispensable part of everyday life, and such technology could undeniably aid the promotion of health and wellbeing; however, various concerns have arisen regarding its use in the diagnosis of critical pathologies such as cancer [4].
According to WHO estimates, cancer cases are projected to increase by 77% by 2050, reaching an estimated 35 million new cases per year. Survey data from 115 countries show that the majority of countries do not adequately prioritize cancer and palliative care services as part of universal health coverage: only 39% of them cover the basics of cancer management, with broader coverage of services such as pain management being even rarer, highlighting disparities in cancer care worldwide [5]. The International Agency for Research on Cancer (IARC) estimates that ten cancer types account for about two-thirds of new cases and deaths, with lung, breast, and colorectal cancer being the three major cancer types globally in 2022 [5]. However, for many people, receiving a cancer diagnosis can be at least as hard to bear as the disease itself.
In this state-of-the-art review, the types, advantages, disadvantages, and challenges of novel chatbots in medicine, and particularly in oncology, are illustrated in detail, and an expert opinion on the topic is discussed. This review aims to contribute to a better understanding of the advantages of these chatbots, to highlight their disadvantages, and to reveal the challenges and obstacles they face in medicine and cancer care.

2. The Novel Chatbots in Medicine and Cancer

A chatbot, previously known as a chatterbot, is a software program or web interface that simulates human conversation through text or voice interactions. Modern chatbots operate online and use advanced artificial intelligence technologies to converse with users in natural language, mimicking human conversation patterns. While basic chatbots have existed for decades, contemporary versions often incorporate natural language processing and deep learning [6]. Scripted or rule-based chatbots typically rely on standardized user input through menus, tiles, or carousels, with responses based on predetermined rules. These types of chatbots are commonly used in marketing, customer service, and telecommunications to assist clients with common queries using pre-written scripts. Scripted chatbots do not support open-ended conversations. In contrast, AI chatbots, also known as AI conversational agents, process natural language using neural networks or algorithms instead of relying on pre-written scripts [7]. However, chatbots can sometimes generate coherent yet incorrect or fabricated information, a phenomenon referred to as “hallucinations”. This occurs because chatbots predict responses rather than truly understanding the meaning of the input. When humans use and rely on chatbot-generated content tainted with hallucinations, it is referred to as “botshit” [8].
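To make the distinction above concrete, the minimal Python sketch below contrasts a scripted (rule-based) bot, which matches user input against predefined patterns, with a generative bot, which delegates to a learned language model. All rule patterns, messages, and the `language_model.generate` interface are illustrative assumptions, not taken from any specific product.

```python
import re

# Scripted (rule-based) chatbot: responses come from predefined rules, so it
# cannot hold open-ended conversations, but it also cannot "hallucinate".
RULES = [
    (re.compile(r"office hours", re.I), "Our clinic is open Mon-Fri, 08:00-16:00."),
    (re.compile(r"refill|prescription", re.I), "To request a refill, please give your prescription number."),
    (re.compile(r"appointment", re.I), "I can book an appointment. Which day suits you?"),
]
FALLBACK = "I can only answer questions about office hours, refills, and appointments."

def scripted_reply(user_input: str) -> str:
    """Return the first matching scripted response, or a fallback message."""
    for pattern, response in RULES:
        if pattern.search(user_input):
            return response
    return FALLBACK

def generative_reply(user_input: str, language_model) -> str:
    """AI (generative) chatbot: predicts a reply from a trained model rather than
    looking it up; the model interface here is a hypothetical assumption."""
    return language_model.generate(prompt=user_input, max_tokens=100)

if __name__ == "__main__":
    print(scripted_reply("What are your office hours?"))
```

The sketch also illustrates why only the generative variant can produce fluent but unsupported answers: the scripted bot can only ever return one of its pre-written strings.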
In 1950, Alan Turing published the influential article “Computing Machinery and Intelligence”, in which he introduced the Turing test as a measure of machine intelligence. This test evaluates a computer program’s ability to mimic human conversation in real time, making it difficult for a human judge to distinguish between the program and an actual human based solely on the conversation’s content [9]. Joseph Weizenbaum’s 1966 program “ELIZA” attracted attention, as it seemed capable of fooling users into believing that they were interacting with a real human. However, Weizenbaum did not claim that ELIZA was genuinely intelligent; instead, he presented it as a debunking exercise [10]. A subsequent advancement was Kenneth Colby’s “PARRY” chatbot, which simulated the personality of a paranoid patient. One of the most well-known chatbots is “ALICE”, developed by Richard Wallace in 1995. ALICE avoids improper responses by using pattern-matching algorithms to retrieve example sentences from predefined output templates [11]. With the resurgence of interest in AI and machine learning, chatbots are increasingly prevalent and are being used in various domains. Recognizable web-based voice-activated assistants such as Microsoft Cortana, Apple Siri, Amazon Alexa, and Google Assistant have become widespread following the popularity of chat apps [12]. Despite these advancements, conversational artificial intelligence still faces broad-scale challenges, and some software developers focus on using chatbots for information retrieval. Nowadays, chatbots are widely used in messaging apps, internal company platforms, politics, toys, data security, education, and healthcare [4,13,14,15,16,17,18].
A systematic review reported that, in 2021, there were 5163 chatbots in the Botlist.co directory, of which 95 were used for healthcare applications such as diagnosis (12 chatbots), treatment, monitoring, support, workflow, and health promotion, with Messenger being the most commonly reported platform across these categories [11]. The initial medical chatbots were created to provide a few simple automated responses to common patient questions, such as office hours and medication refill requests, and to facilitate tasks such as appointment scheduling. Chatbots were then integrated into healthcare information websites through platforms like WebMD, marking an early stage in which they were intended to answer user queries; further developments included their integration into electronic health record systems to streamline administrative tasks and boost the efficiency of healthcare professionals [19,20,21]. Recent developments, however, have thrust chatbots into crucial roles in patient interaction and emotional counseling services. Of particular note, chatbots like Woebot have become important resources for mental health, facilitating meaningful dialogues and providing interventions based on cognitive behavioral therapy. This development highlights the groundbreaking ability of chatbots, including more recent versions like ChatGPT, to go beyond their original function of information delivery and to actively engage in patient care; the ability of these AI-powered conversational bots to positively influence patient lifestyle and decisions is becoming more and more apparent as they develop, changing the way that healthcare is delivered and how patients are treated [19]. Dozens of chatbots focused on fitness and health are now accessible on the internet; although fitness bots are the most common in this area, there are also many valuable medical bots. Even if they are not yet widely used in healthcare, chatbots are becoming increasingly popular: “FitCircle” and “GymBot” for fitness, “Forksy” and “SlimMe” for nutrition guidance, and “Mendel Health” for oncology are a few prominent examples of specialized chatbots [22,23,24,25]. However, there are other issues with implementing and delivering these technologies, such as slowly changing laws and the challenges of acquiring, processing, and retaining private information. Currently, diagnosing a patient in Russia requires a face-to-face consultation between the physician and the patient, while the Ministry of Health has mandated that half of all medical consultations must take place online by 2030 [26].
The COVID-19 pandemic curtailed face-to-face consultations between doctors and patients, thereby accelerating telemedicine and online medical diagnosis and recommendations [27]. ChatGPT, the GPT-based chatbot, can respond to user inquiries about disease prevention and health promotion, including immunization and screening, while WhatsApp and the WHO have partnered to create a chatbot service that responds to user inquiries regarding COVID-19 [28]. Indeed, the Indian government introduced the “MyGov Corona Helpdesk” in 2020, which operated via WhatsApp and provided users with information about the COVID-19 pandemic [29].
A chatbot’s architecture is based on a user interface, through which the user interacts with it; a natural language understanding component, which accepts user inputs and extracts the relevant information; a dialogue management component, which monitors the conversation’s flow and determines suitable responses; an integration layer, which links the chatbot to external systems and databases to retrieve data or take action; and an analytics and feedback component, which monitors usage metrics and collects feedback for improvement [30,31].
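Purely as an illustration, the Python sketch below mirrors the five architectural components just listed (user interface, natural language understanding, dialogue management, integration layer, and analytics). The class and method names, and the toy appointment-booking logic, are assumptions made for clarity, not a reference implementation.

```python
from dataclasses import dataclass, field

@dataclass
class NLUComponent:
    """Accepts raw user input and extracts an intent plus key entities."""
    def parse(self, text: str) -> dict:
        # A real system would use an NLP model; keyword matching stands in here.
        intent = "book_appointment" if "appointment" in text.lower() else "unknown"
        return {"intent": intent, "entities": {}, "text": text}

@dataclass
class IntegrationLayer:
    """Links the chatbot to external systems (e.g., a scheduling database)."""
    def fetch(self, intent: str) -> str:
        return "Next available slot: Tuesday 10:00" if intent == "book_appointment" else ""

@dataclass
class DialogueManager:
    """Tracks the conversation flow and chooses a suitable response."""
    def respond(self, parsed: dict, backend_data: str) -> str:
        if parsed["intent"] == "book_appointment":
            return f"I can help with that. {backend_data}"
        return "Could you rephrase your request?"

@dataclass
class Analytics:
    """Collects usage metrics and feedback for later improvement."""
    log: list = field(default_factory=list)
    def record(self, user_text: str, reply: str) -> None:
        self.log.append((user_text, reply))

class ChatbotUI:
    """User-facing entry point tying the components together."""
    def __init__(self):
        self.nlu, self.integration = NLUComponent(), IntegrationLayer()
        self.dialogue, self.analytics = DialogueManager(), Analytics()

    def handle(self, user_text: str) -> str:
        parsed = self.nlu.parse(user_text)
        data = self.integration.fetch(parsed["intent"])
        reply = self.dialogue.respond(parsed, data)
        self.analytics.record(user_text, reply)
        return reply

if __name__ == "__main__":
    print(ChatbotUI().handle("Can I make an appointment?"))
```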
Currently, there are five main ways of classifying chatbots in the health domain. The first is by knowledge domain, based on the data used to train the chatbot or the knowledge it can readily access: closed-domain chatbots focus on specialized material, whereas open-domain chatbots cover general topics. The second is by the service provided, based on the user’s emotional closeness to the provider and the degree of personal contact involved in the task; this can be further subdivided into intrapersonal and interpersonal domains for companionship and personal support, information transmission services, and interagent conversation with other chatbots. The third classification, based on the goals to be achieved, distinguishes task-oriented, conversational, and informational chatbots. The fourth concerns response generation, that is, how inputs are interpreted and outputs produced, and distinguishes rule-based, retrieval-based, and generative chatbots. Lastly, human-aided chatbots use human computation, which offers greater robustness and flexibility but is slower when handling many requests [32,33]. These chatbot types, together with recommended applications for each, are presented in Table 1.
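Because the paragraph above packs five classification axes into prose, the short sketch below restates them as a data structure, purely as a reading aid. The example profile at the end (“SymptomTriageBot”) is hypothetical and not drawn from Table 1.

```python
from dataclasses import dataclass
from enum import Enum

class KnowledgeDomain(Enum):
    CLOSED = "closed"            # specialized material, e.g., one disease area
    OPEN = "open"                # general topics

class ServiceType(Enum):
    INTRAPERSONAL = "intrapersonal"  # companionship / personal support
    INTERPERSONAL = "interpersonal"  # information transmission services
    INTERAGENT = "interagent"        # conversation with other chatbots

class Goal(Enum):
    TASK_ORIENTED = "task-oriented"
    CONVERSATIONAL = "conversational"
    INFORMATIONAL = "informational"

class ResponseGeneration(Enum):
    RULE_BASED = "rule-based"
    RETRIEVAL_BASED = "retrieval-based"
    GENERATIVE = "generative"

@dataclass
class HealthChatbotProfile:
    name: str
    knowledge_domain: KnowledgeDomain
    service_type: ServiceType
    goal: Goal
    response_generation: ResponseGeneration
    human_aided: bool  # human computation in the loop: more robust, but slower

# Hypothetical example entry, for illustration only.
example = HealthChatbotProfile(
    name="SymptomTriageBot",
    knowledge_domain=KnowledgeDomain.CLOSED,
    service_type=ServiceType.INTERPERSONAL,
    goal=Goal.TASK_ORIENTED,
    response_generation=ResponseGeneration.RETRIEVAL_BASED,
    human_aided=False,
)
```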
Particularly in the field of oncology, there are chatbots serving various purposes: examining radiologic data to aid clinical diagnosis; monitoring symptoms and the severity and stage of the disease; collecting family history to identify hereditary cancer cases; examining all available data to generate treatment plans for oncologists; providing access to care institutions and educational material; offering daily mental support and tracking; and providing general lifestyle coaching, healthy-diet guidance, and smoking cessation support. Table 2 illustrates commonly used chatbots with examples of current applications and proposed designs [11].

3. The Benefits of Chatbots in Medicine and Cancer

From a general point of view, chatbots have numerous advantages. The most important is that they can provide support 24/7, responding instantly regardless of location, and they answer automatically, which leads to heightened lifetime value, robust brand affinity, and higher levels of satisfaction [34]. Additionally, they reduce the overall volume of calls and chats, allowing support professionals to concentrate on the most important discussions; in this way, they help users reduce daily stress and anxiety and cope with stressful situations, and they can deliver messages that are emotionally comforting and customized to the stressors that users share with them [35,36]. Contemporary chatbots aim to converse with people in a conversational manner, offering support in a friendly and engaging way, answering inquiries, and making product recommendations [37]. Another benefit is that some chatbot systems support multilingual text and speech in a fraction of a second, which is crucial in healthcare, where they can support behavior change and act as assistants that help with routine tasks and activities in living environments [38,39]. Chatbots also enable new user touch points, improve convenience through easy navigation and deep learning, and reduce service and support costs for users [40]. Because they gather useful user information during conversations, such as preferences, browsing history, and activity, chatbots are data-driven tools; this information can be used to customize marketing campaigns, enhance existing products, and make informed decisions while avoiding human error. Importantly, they can display visual content that is more descriptive and realistic, and they can provide compliance and security for the user’s personal information [41,42,43,44].
In the field of medicine in particular, beyond the aforementioned, chatbots offer some additional advantages. Chatbots such as ChatGPT have been characterized as beneficial for physicians and researchers, as they can save time and resources and provide fresh research perspectives [45]. For potential patients, the actual users, there are several more specific benefits: 24/7 availability with rapid, daily responses for critical cases and for individuals with chronic medical conditions; standardized data collection that can build an electronic health record; quick access to important information on nearby hospitals, centers, and pharmacies and their operating hours; determination of whether a critical health condition requires immediate attention and care; personalized and nuanced medication planning; appointment scheduling integrated with online calendars and automatic reminders; support in diagnosing rare or tricky illnesses and interpreting complex diagnostic tests; and broad support for a person’s overall health, from physical activity to behavioral and mental health needs [46]. Moreover, more effective patient self-service becomes possible, since chatbots can help users reach the appropriate data and stay healthy and safe, and they can provide a second or third opinion on a diagnosis or medication plan, regardless of the user’s location [47]. Additionally, given that patients are highly diverse, including children, older adults, people with sensory impairments, people with mental or behavioral issues, non-English speakers, and people from other countries, many chatbots provide better patient engagement [48]. Aside from their greater efficiency compared with doctors, they can also scale more easily: most of a doctor’s time is spent speaking with patients over the phone or in person, and availability and geography restrict these interactions, whereas chatbots enable an unlimited number of people with an internet connection to request assistance whenever and from wherever they wish. Thanks to this scalability, physicians can more easily manage patient demand without incurring additional fees, maintaining low costs while providing high-quality service [49]. Other chatbot benefits in healthcare include fast and personal interactions with fewer errors, real-time assistance, a reduction in administrative tasks for health facilities, and the elimination of patient waiting [50].
Especially in cancer care, conversational AI chatbots have been found to have the potential to provide patients with answers to their questions that are as good, empathetic, and readable as those given by doctors, and expert AI chatbots trained on large medical text corpora may provide vulnerable populations with information and act as point-of-care electronic medical tools; they may also offer emotional support to cancer patients and enhance oncology care [51]. Another study showed that the most commonly reported chatbot applications were for cancer screening, prevention, risk stratification, treatment, monitoring, and management; the authors concluded that chatbots used in oncological care have so far shown a high degree of user satisfaction, and several have been effective in enhancing patient-centered communication, making cancer-related information more accessible, and facilitating access to care. At present, the main drawback of such chatbots is their requirement for rigorous user testing and iterative development prior to broad deployment [52]. Other literature data from women living with breast cancer show that, in comparison with the “one size fits all” approach used by healthcare workers to provide information, chatbots are practical and affordable tools for improving self-care practices and reducing the side effects of chemotherapy, and they can also act as empowering tools that support nurses in educating women with breast cancer and help these women take an active role in managing their symptoms [53]. Importantly, a study in women with breast cancer found that the chatbot’s EORTC INFO25 scores in providing the answers the women needed were noninferior to those of physicians [54].

4. The Disadvantages of Chatbots in Medicine and Cancer

Some major disadvantages of chatbots are their dependence on technology, internet connectivity, and automation in general. Since chatbots require an internet connection to work, users in places with spotty or inconsistent connectivity may find it difficult to use chatbot services, so the potential restrictions of the user base, particularly with regard to patients, must be taken into account [55]. Chatbots must also adapt as technology evolves and expectations change, and although these AI technologies reduce costs for customers, they have high initial development and maintenance costs [56]. Additionally, using chatbots requires the user to have relevant knowledge of the technology; not only are chatbots complex in certain cases, especially for patients whose condition determines the extent to which they can use them, but they can also cause frustration and aggression, because users may become irritated if the AI misinterprets their meaning or gives irrelevant replies, or if they cannot handle the tool appropriately [57]. As is typically the case with technologically driven modifications to existing services, some customers, mostly those from older generations, find chatbots uncomfortable because of their limited comprehension, which makes it clear that robots are handling their requests. Conversations that are non-linear and require back-and-forth exchanges are also challenging for chatbots to handle [58]. A chatbot’s efficiency depends largely on language processing, which is limited by anomalies such as accents and errors; additionally, because the input/output database is fixed and has a finite capacity, chatbots may fail when attempting to answer an unsaved query [59]. While generative neural networks, which use deep learning algorithms to produce novel answers word by word depending on user input, are often trained on an enormous number of natural language phrases, chatbots generally require a large quantity of conversational data for training [60]. Furthermore, apart from the ethical and privacy questions raised about chatbots such as ChatGPT, using them to handle private or sensitive client data may lead to security issues; it is essential to ensure that data are encrypted, stored safely, and shielded from unwanted access, because security lapses may result in serious repercussions, such as reputational and legal harm [61,62].
Chatbots have additional disadvantages in the medical field, even from the point of view of physicians and medical researchers, since the information they provide may not be accurate enough even for these expert users [33]. Importantly, adoption is hampered by, among other things, algorithmic and data access restrictions, laws, an excessive dependence on data, and a lack of time or resources to undertake research while simultaneously managing and caring for human lives; chatbots run a higher risk of producing erroneous or defective solutions, since they require constant problem solving when using a network-generated algorithm [63,64]. Additionally, a lack of transparency causes healthcare professionals to use them less and have less faith in them, while adoption is further hampered by poor information availability, a lack of time for clinical research, and a laborious data collection procedure that may negatively affect patient flow [65,66]. The inability to gather huge data sets also hinders technology adoption, and delayed implementation is aggravated by the tightly controlled structure of healthcare, liability risks, privacy difficulties, and an onerous approval procedure for the latest innovations [63,64]. Moreover, by their very design, chatbots offer no human interaction, and some people may feel shy and uncomfortable talking with an automated non-human system and sharing personal information and very sensitive health matters; to some, chatbots may not be as trustworthy as a genuine person who can give tailored advice and respond to questions instantly, particularly for cancer or other serious health problems [65]. Apart from the limited information, security concerns, system overload, and issues regarding data inaccuracy and credibility, chatbots rely on big data and AI; using a chatbot for healthcare services may mean that numerous organizations have gained access to one’s private details, while big data and AI can be costly for startups or small businesses that do not have the resources to operate efficiently or still need access to this kind of technology [50,66]. Last but not least, aside from chatbots’ inability to give access to healthcare professionals and specialists, they sometimes provide inaccurate medical advice; given that they are human-programmed, they are prone to mistakes, some of which can be extremely harmful and serious, such as providing a patient with incorrect medical instructions or implying that they have developed a new, nonexistent ailment. Such misdiagnosis can occur for various reasons: erroneous system diagnosis, wrong estimation by the AI, inadequacy of the chatbot’s questions, or inaccurate data provided by the user, whether due to distress and uneasiness or to underestimation or neglect of their symptoms [67,68].
Particularly in oncology, one retrospective, cross-sectional study assessed how well OpenAI’s commercial ChatGPT produced treatment recommendations for lung, prostate, and breast cancer, comparing those recommendations with the National Comprehensive Cancer Network (NCCN) standard of care; the researchers found that about one-third of the chatbot’s treatment recommendations deviated from the NCCN guidelines and that the recommendations differed depending on how the questions were phrased. Differences between the guidelines and the chatbot’s output were frequently attributed to unclear answers, suggesting that care should be taken when using chatbots to obtain treatment-related information [69]. In a related study, researchers compared the quality of information produced by ChatGPT v3.5 with that of Perplexity, Chatsonic, and Bing AI for the top Google search queries pertaining to five common cancers (lung, skin, colorectal, breast, and prostate); despite an excellent median DISCERN score, the replies were written at a college reading level, making them difficult to understand and not easily actionable [70]. Figure 1 shows the advantages and disadvantages of chatbots in medicine and cancer discussed above.
Based on the advantages and disadvantages of chatbots in medicine and cancer summarized in Figure 1, Table 3 presents the major implications of chatbots for cancer.
Therefore, regardless of the various advantages that chatbots may offer in cancer care in particular, they also carry various disadvantages that should be taken into account before they are used.

5. The Current Issues and Challenges of Chatbots

At first glance, the major challenges of chatbots include the design of their style, data gathering and algorithms, and the need for continuous maintenance; nevertheless, understanding messages and determining the user’s intention is one of the main problems in using chatbots for customer service, since the first thing that must be done when creating a chatbot is to program adaptable algorithms for determining a message’s intent [71]. Personalization is another challenge: the difficulty lies in figuring out the best ways to adapt to the user, and it can only be resolved through repeated trial and error in each unique situation [71,72,73]. A significant obstacle associated with tailoring and modifying chatbot behavior is understanding the limitations of natural language processing: although it is the foundation of every chatbot, over-reliance on it can still produce confusing misinterpretations of what the user meant; things are not always so bad, but miscommunication does take place [74]. Machine learning is another option, but for it to work well it requires a very specific set of guidelines; otherwise, chaos will ensue. Moreover, another major challenge for chatbots is undeniably digital literacy: there are many types of users, and older people in particular, as well as those who cannot use such technology because of a lack of knowledge or other personal or health reasons, may be unable to use a chatbot or to interpret the meaning of its responses [75].
Particularly in the field of healthcare, to be able to identify specific patterns in big data, developers and suppliers of AI/machine-learning-driven medical products need a significant amount, speed, variety, and veracity of medical data [76]. Additionally, as discussed earlier, privacy concerns are a common obstacle, along with the user’s ability to provide truthful answers and the user’s trust, which is affected by various factors beyond empathy and emotional intelligence [77]. One thing that restricts the usage of chatbots is that, in the early phases of their creation or use, they might not be able to comprehend patients well enough to respond appropriately to certain types of questions. It has been said that chatbots in medicine are a medical supplement and not a substitute, because they fall short of the human qualities of empathy, instinct, and prior experience that make healthcare workers human [78]. Furthermore, AI chatbots raise serious questions about computational bias and fairness as they become more and more prevalent in the healthcare industry, since they may reflect the inherent biases or lack of diversity in their training data. It is crucial to ensure that the creation and application of automated chatbot models within the healthcare industry follow the rules of justice and equity, given the possibility of unfavorable outcomes; reaching this goal can support fair healthcare outcomes and access for all populations, irrespective of their demographic makeup [79,80,81]. Moreover, AI chatbots in the healthcare industry may face substantial challenges in navigating regulatory environments; in addition, the fast-developing field of chatbot software and the absence of industry standards in chatbot applications exacerbate the difficulties of regulatory assessment and supervision [82,83]. It should also be mentioned that, apart from acceptability, health literacy matters too: users should be able to read and comprehend the chatbot’s outputs, yet users have varying levels of knowledge, and this is a further limitation of digital health [84,85]. Last but not least, even if doctors currently use technology and AI to help them find appropriate ways to deliver bad news to a patient, chatbots are not humans, and it seems difficult for a non-human medium to communicate bad news to the user, who is a human, particularly regarding a diagnosis but also regarding therapeutic strategies or other issues, not to mention that a possible false diagnosis has detrimental consequences for the user [86]. Despite their acceptance by doctors, there are still issues with chatbots’ incapacity to understand human emotion and situations in which medical intelligence and expertise are needed [87].
As discussed in the previous section, the main drawback of chatbots in oncology is that they must undergo rigorous user testing and iterative development before being widely used [52]. As with chatbot use for all other diseases and purposes, challenges include data privacy, security, and ethical aspects [88]. Figure 2 summarizes the current challenges and obstacles of chatbots in medicine and cancer.

6. The Expert Opinion on Chatbots and Oncology: All That Glitters Is Not Gold

AI technology and novel chatbots have remarkably revolutionized modern life. Given that chatbots in oncology are not yet common but will likely become so in the near future, several points from this critical state-of-the-art review should be taken into account for this technology to be widely accepted, adopted, and efficient for society. The previous sections presented the benefits, disadvantages, and challenges of novel chatbots in detail; this section discusses some crucial aspects of the topic at greater length.
Undoubtedly, chatbots offer several advantages for users and society, but many disadvantages and challenges lie behind them. Nevertheless, some efforts to overcome these obstacles have been recommended, such as regularly updating the chatbot’s knowledge base, incorporating machine learning and natural language processing, administering data validation and verification, ensuring data safety, and providing human assistance wherever possible [45]. Not only will chatbots be beneficial for researchers, academics, and even medical students for educational purposes, but they will also improve societal health literacy, so that more and more people gain knowledge of health matters; indeed, physicians have characterized them as undergraduates in the school of medicine and new members of the medical community [1].
Of course, prevention is better than cure, and an early diagnosis can deliver promising results; many chatbots aid in the prevention and early diagnosis of communicable and non-communicable diseases, the former being critical especially during epidemics and pandemics such as COVID-19. However, people who are digitally illiterate and unfamiliar with AI technologies and chatbots will not be diagnosed appropriately through them and will remain ill; this scenario is critical particularly for communicable diseases, which have consequences at the societal level, and for serious conditions characterized by underlying systemic inflammation, such as autoimmunity and cancer [89]. Given that a growing number of people have such underlying medical conditions, particularly older adults who may also be unfamiliar with AI and chatbots, one could argue that this is a major and difficult obstacle to overcome.
Chatbots are personalized, but they may not be personable. Behind the digital scenes, the same impersonality remains, and a person’s full history cannot be monitored through them; algorithms complex enough to do so could crash the system and trigger incorrect outputs to the user. Additionally, chatbots are capable of producing coherent-sounding but erroneous or invented information, a phenomenon known as “hallucinations”, because they predict answers rather than understanding the meaning of the input [9]. This AI technology uses predetermined standard questions in order to produce results, yet symptoms can overlap and occur due to other underlying medical conditions, and even when a differential diagnosis is evident in the output, the user, who is not an expert, cannot interpret it. Clinical judgment is not fully replicated by AI systems; factors such as patient composure, cognitive function, and clinical state are important in determining patient risk but are not conclusively recorded in data. Extending prognostic AI models to incorporate patient-reported outcomes that continuously record symptoms and functional ability outside the clinic could increase model accuracy and clinical relevance. Importantly, a user can report fake symptoms and again trigger a false diagnosis; however, false results and misdiagnosis are possible not only in digital diagnosis but also in clinical and laboratory diagnosis, as was recently seen in false COVID-19 test results and even in Progressive Multifocal Leukoencephalopathy diagnoses [90,91,92,93,94,95,96].
Whether a diagnosis is real or false, in most cases bad news must be delivered to the potential patient. In the case of an accurate diagnosis, one could argue that it is cold and inhumane for a patient to be informed of a poor prognosis, a bad or worsened diagnosis, or a treatment failure via a chatbot. This is particularly important in cancer, where patients’ psychology is strongly affected by such a diagnosis. In the case of a false diagnosis, one could argue that it could have significant consequences for the person’s mental health and their family, since a cancer diagnosis is widely perceived as a heavy burden and even a stigma.
Healthcare communication is a diverse field that encompasses communication and interactions between patients, caregivers, healthcare practitioners, and the larger healthcare ecosystem. It has long been understood that good communication is essential to providing high-quality healthcare and is central to the early detection of health issues, treatment plan adherence, patient education, and overall patient satisfaction; however, traditional healthcare communication strategies have faced both opportunities and challenges with the arrival of the digital age [19]. Of course, digital diagnosis cannot substitute for clinical and laboratory diagnosis, and for prognosis, diagnosis, and therapeutic strategy planning, communication between patients and experts remains the gold standard; this is a very important parameter influencing patient outcomes, especially in cancer. Such patients face extra obstacles, particularly psychological ones, since psychology matters in cancer, and the disadvantages, obstacles, and challenges of chatbots weigh much more heavily in cancer than in other health conditions.

7. Conclusions

AI technology and novel chatbots have remarkably revolutionized modern life, particularly in the field of medicine. Although such technologies have various advantages, they also have disadvantages, and important challenges and obstacles have arisen, especially with regard to their use in cancer care. People, and especially cancer patients, should therefore be informed about chatbot technology and use it only in specific, safe situations in which the benefits outweigh the risks.

Author Contributions

Conceptualization, G.G. and E.S.; methodology, G.G.; software, G.G.; validation, G.G. and E.S.; formal analysis, G.G.; investigation, G.G.; resources, G.G.; data curation, G.G.; writing—original draft preparation, G.G.; writing—review and editing, T.I.D., K.S., N.S. and E.S.; visualization, G.G.; supervision, E.S.; project administration, G.G.; funding acquisition, G.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Meskó, B. COVID-19’s Impact on Digital Health Adoption: The Growing Gap between a Technological and a Cultural Transformation. JMIR Hum. Factors 2022, 9, e38926. [Google Scholar] [CrossRef]
  2. Yang, K.; Hu, Y.; Qi, H. Digital Health Literacy: Bibliometric Analysis. J. Med. Internet Res. 2022, 24, e35816. [Google Scholar] [CrossRef] [PubMed]
  3. Mouliou, D.S.; Pantazopoulos, I.; Gourgoulianis, K.I. COVID-19 Smart Diagnosis in the Emergency Department: All-in in Practice. Expert. Rev. Respir. Med. 2022, 16, 263–272. [Google Scholar] [CrossRef] [PubMed]
  4. Xue, J.; Zhang, B.; Zhao, Y.; Zhang, Q.; Zheng, C.; Jiang, J.; Li, H.; Liu, N.; Li, Z.; Fu, W.; et al. Evaluation of the Current State of Chatbots for Digital Health: Scoping Review. J. Med. Internet Res. 2023, 25, e47217. [Google Scholar] [CrossRef] [PubMed]
  5. Global Cancer Burden Growing, amidst Mounting Need for Services. Available online: https://www.who.int/news/item/01-02-2024-global-cancer-burden-growing--amidst-mounting-need-for-services (accessed on 9 June 2024).
  6. McTear, M.; Callejas, Z.; Griol, D. The Conversational Interface; Springer International Publishing: Cham, Switzerland, 2016; ISBN 9783319329659. [Google Scholar]
  7. Nicolescu, L.; Tudorache, M.T. Human-Computer Interaction in Customer Service: The Experience with AI Chatbots—A Systematic Literature Review. Electronics 2022, 11, 1579. [Google Scholar] [CrossRef]
  8. Hannigan, T.R.; McCarthy, I.P.; Spicer, A. Beware of Botshit: How to Manage the Epistemic Risks of Generative Chatbots. Bus. Horiz. 2024; in press. [Google Scholar] [CrossRef]
  9. Turing, A.M. I.—Computing machinery and intelligence. Mind 1950, LIX, 433–460. [Google Scholar] [CrossRef]
  10. Weizenbaum, J. ELIZA—A Computer Program for the Study of Natural Language Communication between Man and Machine. Commun. ACM 1966, 9, 36–45. [Google Scholar] [CrossRef]
  11. Xu, L.; Sanders, L.; Li, K.; Chow, J.C.L. Chatbot for Health Care and Oncology Applications Using Artificial Intelligence and Machine Learning: Systematic Review. JMIR Cancer 2021, 7, e27850. [Google Scholar] [CrossRef]
  12. AbuShawar, B.; Atwell, E. ALICE Chatbot: Trials and Outputs. Comput. y Sist. 2015, 19, 625–632. [Google Scholar] [CrossRef]
  13. Amalia, A.; Suprayogi, M. Engaging Millennials on Using Chatbot Messenger for Eco-Tourism; Atlantis Press: Paris, France, 2019; pp. 484–487. [Google Scholar]
  14. Chandel, S.; Yuying, Y.; Yujie, G.; Razaque, A.; Yang, G. Chatbot: Efficient and Utility-Based Platform. In Proceedings of the Intelligent Computing; Arai, K., Kapoor, S., Bhatia, R., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 109–122. [Google Scholar]
  15. Kim, Y.; Lee, H. The Rise of Chatbots in Political Campaigns: The Effects of Conversational Agents on Voting Intention. Int. J. Hum. Comput. Interact. 2023, 39, 3984–3995. [Google Scholar] [CrossRef]
  16. Lin, P.-C.; Yankson, B.; Lu, Z.; Hung, P.C.K. Children Privacy Identification System in LINE Chatbot for Smart Toys. In Proceedings of the 2019 IEEE 12th International Conference on Cloud Computing (CLOUD), Milan, Italy, 8–13 July 2019; pp. 86–90. [Google Scholar]
  17. Lai, S.-T.; Leu, F.-Y.; Lin, J.-W. A Banking Chatbot Security Control Procedure for Protecting User Data Security and Privacy. In Proceedings of the Advances on Broadband and Wireless Computing, Communication and Applications; Barolli, L., Leu, F.-Y., Enokido, T., Chen, H.-C., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 561–571. [Google Scholar]
  18. Liu, L.; Subbareddy, R.; Raghavendra, C.G. AI Intelligence Chatbot to Improve Students Learning in the Higher Education Platform. J. Interconnect. Netw. 2022, 22, 2143032. [Google Scholar] [CrossRef]
  19. Sun, G.; Zhou, Y.-H. AI in Healthcare: Navigating Opportunities and Challenges in Digital Communication. Front. Digit. Health 2023, 5, 1291132. [Google Scholar] [CrossRef] [PubMed]
  20. Goel, R.; Goswami, R.P.; Totlani, S.; Arora, P.; Bansal, R.; Vij, D. Machine learning based healthcare chatbot. In Proceedings of the 2022 2nd International Conference on Advance Computing, Innovative Technologies in Engineering (ICACITE), Greater Noida, India, 28–29 April 2022; pp. 188–192. [Google Scholar]
  21. Kocakoç, I.D. The role of artificial intelligence in health care. In The Impact of Artificial Intelligence on Governance, Economics, Finance; Springer: New York, NY, USA, 2022; Volume 2, pp. 189–206. [Google Scholar]
  22. FitCircle: Where Bots Make Daily Eating Habits Healthier. Available online: https://www.firstpost.com/tech/startup/fitcircle-where-bots-make-daily-eating-habits-healthier-3725943.html (accessed on 10 June 2024).
  23. Sawers, P. Gymbot: A Bot That Tracks Your Workouts through Facebook Messenger; VentureBeat: San Francisco, CA, USA, 2016. [Google Scholar]
  24. Rahmanti, A.R.; Yang, H.-C.; Bintoro, B.S.; Nursetyo, A.A.; Muhtar, M.S.; Syed-Abdul, S.; Li, Y.-C.J. SlimMe, a Chatbot with Artificial Empathy for Personal Weight Management: System Design and Finding. Front. Nutr. 2022, 9, 870775. [Google Scholar] [CrossRef]
  25. Mendel AI—Know More, Know Now. Available online: https://www.mendel.ai/ (accessed on 10 June 2024).
  26. Medical-Chat Bot: The History of Our Attempt to Do It—Andrey Lukyanenko. Available online: https://andlukyane.com/blog/medical-chat-bot (accessed on 10 June 2024).
  27. Bouabida, K.; Lebouché, B.; Pomey, M.-P. Telehealth and COVID-19 Pandemic: An Overview of the Telehealth Use, Advantages, Challenges, and Opportunities during COVID-19 Pandemic. Healthcare 2022, 10, 2293. [Google Scholar] [CrossRef] [PubMed]
  28. Ahaskar, A. How WhatsApp Chatbots Are Helping in the Fight against COVID-19. Available online: https://www.livemint.com/technology/tech-news/how-whatsapp-chatbots-are-helping-in-the-fight-against-covid-19-11585310168911.html (accessed on 10 June 2024).
  29. India’s Coronavirus Chatbot on WhatsApp Crosses 1.7 Crore Users in 10 Days. Available online: https://www.gadgets360.com/apps/news/coronavirus-mygov-corona-helpdesk-chatbot-whatsapp-indian-government-total-users-haptik-2204458 (accessed on 10 June 2024).
  30. Matic, R.; Kabiljo, M.; Zivkovic, M.; Cabarkapa, M. Extensible Chatbot Architecture Using Metamodels of Natural Language Understanding. Electronics 2021, 10, 2300. [Google Scholar] [CrossRef]
  31. Developing AI Chatbots: Challenges and Considerations. Available online: https://www.linkedin.com/pulse/developing-ai-chatbots-challenges-considerations-artemakis-artemiou-yelif (accessed on 12 June 2024).
  32. Hien, H.T.; Cuong, P.-N.; Nam, L.N.H.; Nhung, H.L.T.K.; Thang, L.D. Intelligent Assistants in Higher-Education Environments: The FIT-EBot, a Chatbot for Administrative and Learning Support. In Proceedings of the 9th International Symposium on Information and Communication Technology; Association for Computing Machinery: New York, NY, USA, 2018; pp. 69–76. [Google Scholar]
  33. Kucherbaev, P.; Bozzon, A.; Houben, G.-J. Human-Aided Bots. IEEE Internet Comput. 2018, 22, 36–43. [Google Scholar] [CrossRef]
  34. Ta, V.; Griffith, C.; Boatfield, C.; Wang, X.; Civitello, M.; Bader, H.; DeCero, E.; Loggarakis, A. User Experiences of Social Support From Companion Chatbots in Everyday Contexts: Thematic Analysis. J. Med. Internet Res. 2020, 22, e16235. [Google Scholar] [CrossRef]
  35. Medeiros, L.; Gerritsen, C.; Bosse, T. Towards Humanlike Chatbots Helping Users Cope with Stressful Situations. In Proceedings of the Computational Collective Intelligence; Nguyen, N.T., Chbeir, R., Exposito, E., Aniorté, P., Trawiński, B., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 232–243. [Google Scholar]
  36. Medeiros, L.; Bosse, T.; Gerritsen, C. Can a Chatbot Comfort Humans? Studying the Impact of a Supportive Chatbot on Users’ Self-Perceived Stress. IEEE Trans. Hum. Mach. Syst. 2022, 52, 343–353. [Google Scholar] [CrossRef]
  37. Jin, E.; Eastin, M.S. Birds of a Feather Flock Together: Matched Personality Effects of Product Recommendation Chatbots and Users. J. Res. Interact. Mark. 2022, 17, 416–433. [Google Scholar] [CrossRef]
  38. Badlani, S.; Aditya, T.; Dave, M.; Chaudhari, S. Multilingual Healthcare Chatbot Using Machine Learning. In Proceedings of the 2021 2nd International Conference for Emerging Technology (INCET), Belagavi, India, 21–23 May 2021; pp. 1–6. [Google Scholar]
  39. Rojc, M.; Ariöz, U.; Šafran, V.; Mlakar, I. Multilingual Chatbots to Collect Patient-Reported Outcomes. In Chatbots—The AI-Driven Front-Line Services for Customers; IntechOpen: London, UK, 2023; ISBN 9781837689323. [Google Scholar]
  40. Darius, Z.; Hundertmark, S. Chatbots—An interactive technology for personalized communication, transactions and services. IADIS Int. J. WWW/Internet 2017, 95, 96–109. Available online: https://www.researchgate.net/profile/Darius-Zumstein/publication/322855718_Chatbots_-_An_Interactive_Technology_for_Personalized_Communication_Transactions_and_Services/links/5a72ecde458515512076b406/Chatbots-An-Interactive-Technology-for-Personalized-Communication-Transactions-and-Services.pdf (accessed on 17 July 2024).
  41. Reshmi, S.; Balakrishnan, K. Empowering chatbots with business intelligence by big data integration. Int. J. Adv. Res. Comput. Sci. 2018, 9, 627–631. [Google Scholar] [CrossRef]
  42. Hopkins, A.M.; Logan, J.M.; Kichenadasse, G.; Sorich, M.J. Artificial Intelligence Chatbots Will Revolutionize How Cancer Patients Access Information: ChatGPT Represents a Paradigm-Shift. JNCI Cancer Spectr. 2023, 7, pkad010. [Google Scholar] [CrossRef] [PubMed]
  43. Sheehan, B.; Jin, H.S.; Gottlieb, U. Customer Service Chatbots: Anthropomorphism and Adoption. J. Bus. Res. 2020, 115, 14–24. [Google Scholar] [CrossRef]
  44. Følstad, A.; Skjuve, M.; Brandtzaeg, P.B. Different Chatbots for Different Purposes: Towards a Typology of Chatbots to Understand Interaction Design. In Proceedings of the Internet Science; Bodrunova, S.S., Koltsova, O., Følstad, A., Halpin, H., Kolozaridi, P., Yuldashev, L., Smoliarova, A., Niedermayer, H., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 145–156. [Google Scholar]
  45. Sharma, P. Chatbots in Medical Research: Advantages and Limitations of Artificial Intelligence–Enabled Writing With a Focus on ChatGPT as an Author. Clin. Nucl. Med. 2023, 48, 838–839. [Google Scholar] [CrossRef] [PubMed]
  46. Clark, M.; Bailey, S. Chatbots in Health Care: Connecting Patients to Information. Can. J. Health Technol. 2024, 4. [Google Scholar] [CrossRef]
  47. Fan, X.; Chao, D.; Zhang, Z.; Wang, D.; Li, X.; Tian, F. Utilization of Self-Diagnosis Health Chatbots in Real-World Settings: Case Study. J. Med. Internet Res. 2021, 23, e19928. [Google Scholar] [CrossRef] [PubMed]
  48. Sadasivan, C.; Cruz, C.; Dolgoy, N.; Hyde, A.; Campbell, S.; McNeely, M.; Stroulia, E.; Tandon, P. Examining Patient Engagement in Chatbot Development Approaches for Healthy Lifestyle and Mental Wellness Interventions: Scoping Review. J. Particip. Med. 2023, 15, e45772. [Google Scholar] [CrossRef] [PubMed]
  49. Kowatsch, T.; Nißen, M.; Shih, C.-H.I.; Rüegger, D.; Volland, D.; Filler, A.; Künzler, F.; Barata, F.; Hung, S.; Büchter, D.; et al. Text-Based Healthcare Chatbots Supporting Patient and Health Professional Teams: Preliminary Results of a Randomized Controlled Trial on Childhood Obesity; ETH Zurich: Zurich, Switzerland, 2017; 11p. [Google Scholar] [CrossRef]
  50. Rana, J. The Pros and Cons of Healthcare Chatbots; REVE Chat: Singapore, 2023. [Google Scholar]
  51. Chen, D.; Parsa, R.; Hope, A.; Hannon, B.; Mak, E.; Eng, L.; Liu, F.-F.; Fallah-Rad, N.; Heesters, A.M.; Raman, S. Physician and Artificial Intelligence Chatbot Responses to Cancer Questions from Social Media. JAMA Oncol. 2024, 10, 956–960. [Google Scholar] [CrossRef]
  52. Wang, A.; Qian, Z.; Briggs, L.; Cole, A.P.; Reis, L.O.; Trinh, Q.-D. The Use of Chatbots in Oncological Care: A Narrative Review. Int. J. Gen. Med. 2023, 16, 1591–1602. [Google Scholar] [CrossRef]
  53. Tawfik, E.; Ghallab, E.; Moustafa, A. A Nurse versus a Chatbot—The Effect of an Empowerment Program on Chemotherapy-Related Side Effects and the Self-Care Behaviors of Women Living with Breast Cancer: A Randomized Controlled Trial. BMC Nurs. 2023, 22, 102. [Google Scholar] [CrossRef] [PubMed]
  54. Bibault, J.-E.; Chaix, B.; Guillemassé, A.; Cousin, S.; Escande, A.; Perrin, M.; Pienkowski, A.; Delamon, G.; Nectoux, P.; Brouard, B. A Chatbot Versus Physicians to Provide Information for Patients With Breast Cancer: Blind, Randomized Controlled Noninferiority Trial. J. Med. Internet Res. 2019, 21, e15787. [Google Scholar] [CrossRef]
  55. Melián-González, S.; Gutiérrez-Taño, D.; Bulchand-Gidumal, J. Predicting the Intentions to Use Chatbots for Travel and Tourism. Curr. Issues Tour. 2021, 24, 192–210. [Google Scholar] [CrossRef]
  56. Okonkwo, C.W.; Ade-Ibijola, A. Chatbots Applications in Education: A Systematic Review. Comput. Educ. Artif. Intell. 2021, 2, 100033. [Google Scholar] [CrossRef]
  57. Ozuem, W.; Ranfagni, S.; Willis, M.; Salvietti, G.; Howell, K. Exploring the Relationship between Chatbots, Service Failure Recovery and Customer Loyalty: A Frustration–Aggression Perspective. Psychol. Mark. 2024; early view. [Google Scholar] [CrossRef]
  58. Grudin, J.; Jacques, R. Chatbots, Humbots, and the Quest for Artificial General Intelligence. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–11. [Google Scholar]
  59. Gupta, M. Chatbots—Boon or Bane? Available online: https://blog.bluelupin.com/chatbot-advantages-and-disadvantages/ (accessed on 11 June 2024).
  60. Caldarini, G.; Jaf, S.; McGarry, K. A Literature Survey of Recent Advances in Chatbots. Information 2022, 13, 41. [Google Scholar] [CrossRef]
  61. Yang, J.; Chen, Y.-L.; Por, L.Y.; Ku, C.S. A Systematic Literature Review of Information Security in Chatbots. Appl. Sci. 2023, 13, 6355. [Google Scholar] [CrossRef]
  62. Bang, J.; Kim, S.; Nam, J.W.; Yang, D.-G. Ethical Chatbot Design for Reducing Negative Effects of Biased Data and Unethical Conversations. In Proceedings of the 2021 International Conference on Platform Technology and Service (PlatCon), Jeju, Republic of Korea, 23–25 August 2021; pp. 1–5. [Google Scholar]
  63. Singh, J.; Sillerud, B.; Singh, A. Artificial Intelligence, Chatbots and ChatGPT in Healthcare—Narrative Review of Historical Evolution, Current Application, and Change Management Approach to Increase Adoption. J. Med. Artif. Intell. 2023, 6. [Google Scholar] [CrossRef]
  64. Why Is AI Adoption in Health Care Lagging? Available online: https://www.brookings.edu/articles/why-is-ai-adoption-in-health-care-lagging/ (accessed on 11 June 2024).
  65. Brown, J.E.H.; Halpern, J. AI Chatbots Cannot Replace Human Interactions in the Pursuit of More Inclusive Mental Healthcare. SSM Ment. Health 2021, 1, 100017. [Google Scholar] [CrossRef]
  66. Xiao, Z.; Liao, Q.V.; Zhou, M.; Grandison, T.; Li, Y. Powering an AI Chatbot with Expert Sourcing to Support Credible Health Information Access. In Proceedings of the 28th International Conference on Intelligent User Interfaces; Association for Computing Machinery: New York, NY, USA, 2023; pp. 2–18. [Google Scholar]
  67. Tuncel, F.; Mumcu, B.; Tanberk, S. A Chatbot for Preliminary Patient Guidance System. In Proceedings of the 2021 29th Signal Processing and Communications Applications Conference (SIU), Istanbul, Turkey, 9–11 June 2021; pp. 1–4. [Google Scholar]
  68. Kuroiwa, T.; Sarcon, A.; Ibara, T.; Yamada, E.; Yamamoto, A.; Tsukamoto, K.; Fujita, K. The Potential of ChatGPT as a Self-Diagnostic Tool in Common Orthopedic Diseases: Exploratory Study. J. Med. Internet Res. 2023, 25, e47621. [Google Scholar] [CrossRef]
  69. Chen, S.; Kann, B.H.; Foote, M.B.; Aerts, H.J.W.L.; Savova, G.K.; Mak, R.H.; Bitterman, D.S. Use of Artificial Intelligence Chatbots for Cancer Treatment Information. JAMA Oncol. 2023, 9, 1459. [Google Scholar] [CrossRef]
  70. Pan, A.; Musheyev, D.; Bockelman, D.; Loeb, S.; Kabarriti, A.E. Assessment of Artificial Intelligence Chatbot Responses to Top Searched Queries About Cancer. JAMA Oncol. 2023, 9, 1437. [Google Scholar] [CrossRef]
  71. Tantsiura, P. 5 Challenges of Chatbots for Business and How to Overcome Them. Available online: https://theappsolutions.com/blog/development/challenges-of-chatbots-for-business/ (accessed on 12 June 2024).
  72. Fritsch, T. Chatbots: An Overview of Current Issues and Challenges. In Proceedings of the Advances in Information and Communication; Arai, K., Ed.; Springer Nature Switzerland: Cham, Switzerland, 2024; pp. 84–104. [Google Scholar]
  73. Loh, E. ChatGPT and Generative AI Chatbots: Challenges and Opportunities for Science, Medicine and Medical Leaders. BMJ Lead. 2024, 8. [Google Scholar] [CrossRef] [PubMed]
  74. Gökçearslan, Ş.; Tosun, C.; Erdemir, Z.G. Benefits, Challenges, and Methods of Artificial Intelligence (AI) Chatbots in Education: A Systematic Literature Review. IJTE 2024, 7, 19–39. [Google Scholar] [CrossRef]
  75. Tinmaz, H.; Lee, Y.-T.; Fanea-Ivanovici, M.; Baber, H. A Systematic Review on Digital Literacy. Smart Learn. Environ. 2022, 9, 21. [Google Scholar] [CrossRef]
  76. Rezaeikhonakdar, D. AI Chatbots and Challenges of HIPAA Compliance for AI Developers and Vendors. J. Law Med. Ethics 2024, 51, 988–995. [Google Scholar] [CrossRef] [PubMed]
  77. Fink, J. Can Artificial Intelligence Chatbots Convincingly Mimic Empathy? Am. J. Nurs. 2023, 123, 13. [Google Scholar] [CrossRef]
  78. Altamimi, I.; Altamimi, A.; Alhumimidi, A.S.; Altamimi, A.; Temsah, M.-H. Artificial Intelligence (AI) Chatbots in Medicine: A Supplement, Not a Substitute. Cureus 2023, 15, e40922. [Google Scholar] [CrossRef]
  79. Overman, T.; Blum, G.; Klabjan, D. A Primal-Dual Algorithm for Hybrid Federated Learning. arXiv 2022, arXiv:2210.08106. [Google Scholar] [CrossRef]
  80. Benjamin, R. Race after Technology: Abolitionist Tools for the New Jim Code; Polity: Cambridge, UK, 2019; p. 172. ISBN 978-1-509-52643-7. [Google Scholar]
  81. Cath, C. Governing Artificial Intelligence: Ethical, Legal and Technical Opportunities and Challenges. Phil. Trans. R. Soc. A 2018, 376, 20180080. [Google Scholar] [CrossRef]
  82. Cohen, I.G.; Amarasingham, R.; Shah, A.; Xie, B.; Lo, B. The Legal And Ethical Concerns That Arise From Using Complex Predictive Analytics In Health Care. Health Aff. 2014, 33, 1139–1147. [Google Scholar] [CrossRef]
  83. Vayena, E.; Blasimme, A.; Cohen, I.G. Machine Learning in Medicine: Addressing Ethical Challenges. PLoS Med. 2018, 15, e1002689. [Google Scholar] [CrossRef]
  84. Almalki, M.; Azeez, F. Health Chatbots for Fighting COVID-19: A Scoping Review. Acta Inf. Med. 2020, 28, 241–247. [Google Scholar] [CrossRef] [PubMed]
  85. Schillaci, C.E.; de Cosmo, L.M.; Piper, L.; Nicotra, M.; Guido, G. Anthropomorphic Chatbots for Future Healthcare Services: Effects of Personality, Gender, and Roles on Source Credibility, User Satisfaction, and Intention to Use. Technol. Forecast. Soc. Change 2024, 199, 123025. [Google Scholar] [CrossRef]
  86. Suppadungsuk, S.; Thongprayoon, C.; Miao, J.; Krisanapan, P.; Qureshi, F.; Kashani, K.; Cheungpasitporn, W. Exploring the Potential of Chatbots in Critical Care Nephrology. Medicines 2023, 10, 58. [Google Scholar] [CrossRef]
  87. Palanica, A.; Flaschner, P.; Thommandram, A.; Li, M.; Fossat, Y. Physicians’ Perceptions of Chatbots in Health Care: Cross-Sectional Web-Based Survey. J. Med. Internet Res. 2019, 21, e12887. [Google Scholar] [CrossRef]
  88. Tripathi, S.; Tabari, A.; Mansur, A.; Dabbara, H.; Bridge, C.P.; Daye, D. From Machine Learning to Patient Outcomes: A Comprehensive Review of AI in Pancreatic Cancer. Diagnostics 2024, 14, 174. [Google Scholar] [CrossRef] [PubMed]
  89. Mouliou, D.S. C-Reactive Protein: Pathophysiology, Diagnosis, False Test Results and a Novel Diagnostic Algorithm for Clinicians. Diseases 2023, 11, 132. [Google Scholar] [CrossRef] [PubMed]
  90. Mouliou, D.S.; Gourgoulianis, K.I. False-Positive and False-Negative COVID-19 Cases: Respiratory Prevention and Management Strategies, Vaccination, and Further Perspectives. Expert. Rev. Respir. Med. 2021, 15, 993–1002. [Google Scholar] [CrossRef]
  91. Mouliou, D.S.; Dardiotis, E. Current Evidence in SARS-CoV-2 mRNA Vaccines and Post-Vaccination Adverse Reports: Knowns and Unknowns. Diagnostics 2022, 12, 1555. [Google Scholar] [CrossRef]
  92. Mouliou, D.S.; Gourgoulianis, K.I. COVID-19 ‘Asymptomatic’ Patients: An Old Wives’ Tale. Expert. Rev. Respir. Med. 2022, 16, 399–407. [Google Scholar] [CrossRef]
  93. Mouliou, D.S.; Pantazopoulos, I.; Gourgoulianis, K. COVID-19 Diagnosis in the Emergency Department: Seeing the Tree but Losing the Forest. Emerg. Med. J. 2022, 39, 563. [Google Scholar] [CrossRef] [PubMed]
  94. Mouliou, D.S. The Deceptive COVID-19: Lessons from Common Molecular Diagnostics and a Novel Plan for the Prevention of the Next Pandemic. Diseases 2023, 11, 20. [Google Scholar] [CrossRef]
  95. Mouliou, D.S. Managing Viral Emerging Infectious Diseases via Current Molecular Diagnostics in the Emergency Department: The Tricky Cases. Expert Rev. Anti-Infect. Ther. 2022, 20, 1163–1169. [Google Scholar] [CrossRef] [PubMed]
  96. Mouliou, D.S. John Cunningham Virus and Progressive Multifocal Leukoencephalopathy: A Falsely Played Diagnosis. Diseases 2024, 12, 100. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Advantages and disadvantages of chatbots in medicine and cancer.
Figure 2. Current challenges and obstacles of chatbots in medicine and cancer.
Table 1. The five main types of health chatbots and their recommended applications.
Knowledge domain
  • The open domain responds to broader categories of questions that can easily be searched within databases, and it may be preferred for routine symptom screening or for connecting users to providers, services, or health-promoting apps.
  • The closed domain responds to direct or difficult questions that require thorough research, and it may be utilized for treatment planning or recommendations.
Service-provided
  • The interpersonal services are mostly chosen to disseminate information in intimate connections, and they may be preferred for imaging diagnostics or hereditary evaluation, where the purpose is to relay accurate information to users.
  • The intrapersonal services are used for support and communication purposes, and they may be preferred for counseling, emotional support, and health promotion, which require some human touch.
  • The interagent services are preferred for communication with other chatbots/computing systems, and they may be the preferred chatbot type for administrative tasks when transferring patient data among locations.
Goal-based
  • The informative type provides data from warehouse databases/inventory entries, and it may be chosen for linking patients with resources or for the remote monitoring of patients.
  • The conversational type's purpose is to converse with users in a natural manner, and it may be used in counseling, support, and the promotion of health.
  • The task-based type performs a single, predetermined task, and it is preferred in screening and diagnostics.
Response generation
  • This type uses specific algorithms within narrow domains where sufficient evidence is available to train the system, and it is likewise used for screening and diagnostics (a minimal illustrative sketch follows this table).
Human-aided
  • This type incorporates human computation, offering increased flexibility and robustness but decreased speed, and it is preferred for most applications except support and workflow efficiency, which require speed so as to deliver care immediately.
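As a concrete, simplified illustration of the taxonomy above (a closed knowledge domain, a task-based goal, rule-based response generation, and a human-aided fallback), the sketch below shows a minimal keyword-matching triage chatbot. It is not drawn from any of the applications reviewed here; the symptom keywords, rules, and messages are hypothetical and serve only to make the distinctions in Table 1 tangible.

```python
# Minimal illustrative sketch (not from the reviewed applications): a rule-based,
# task-based, closed-domain chatbot that matches user-reported symptoms against a
# small hand-written knowledge base and either returns routine guidance or
# escalates to a clinician. All keywords, rules, and messages are hypothetical.

from dataclasses import dataclass


@dataclass
class Rule:
    keywords: set[str]          # symptom keywords that trigger the rule
    response: str               # canned response returned to the user
    escalate: bool = False      # whether a human clinician should take over


# Hypothetical knowledge base for a narrow (closed) domain: symptom triage.
RULES = [
    Rule({"lump", "swelling"},
         "A new lump or swelling should be examined by a physician; "
         "please book an appointment.", escalate=True),
    Rule({"fatigue", "tired"},
         "Persistent fatigue can have many causes; monitoring is advised. "
         "If it lasts more than two weeks, contact your care team."),
    Rule({"fever", "cough"},
         "These symptoms are often self-limiting; rest and hydration are "
         "recommended, and you will be reminded to re-check in 48 hours."),
]

FALLBACK = ("I could not match your description to my knowledge base. "
            "A member of the care team will contact you.")  # human-aided handover


def respond(user_message: str) -> tuple[str, bool]:
    """Return (response, escalate_to_human) for a single user message."""
    tokens = set(user_message.lower().split())
    for rule in RULES:
        if rule.keywords & tokens:          # any keyword overlap triggers the rule
            return rule.response, rule.escalate
    return FALLBACK, True                   # unknown input is always escalated


if __name__ == "__main__":
    reply, needs_human = respond("I noticed a lump and some swelling")
    print(reply)
    print("Escalate to clinician:", needs_human)
```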
Table 2. Applications of the commonly used chatbots in oncology.
Application | Purpose | Chatbot | Description
Screening and diagnosis | Imaging diagnostics | Medical Sieve | Assesses radiologic data to help physicians' diagnosis
 | Symptom screening | Quro | Presynopsis relying on symptomatology and history to predict the user's situation
 | | Buoy Health | Aids in finding the etiology of diseases and provides medical recommendations
 | | Harshitha breast cancer screening | Dialog flow providing an early diagnosis of breast cancer symptomatology
 | | Babylon | Symptom checker
 | | Healthily (formerly Your.md) | –
 | | Ada | –
 | Hereditary assessment | ItRuns | Collects family history data at a population level to assess hereditary cancer
Treatment | Patient treatment recommendation | Mathew | Detects symptomatology and predicts the illness via a symptom–illness data set and provides appropriate treatment
 | | Madhu | Offers a catalog of available therapies for different illnesses and provides data to the user for the synthesis and prescribed use of medications
 | Connecting patients with providers or resources | Divya | Engages users to provide a personalized diagnosis and links them to suitable healthcare
 | | Rarhi | Offers a symptom-based diagnosis, assesses its seriousness, and links the user with a physician
 | Physician medication planning | Watson for Oncology | Evaluates information from medical records to provide a medication plan for physicians
Monitoring | Remote patient monitoring | STREAMD | Offers access to healthcare instructions and educational data
 | | Conversa | –
 | | Memora Health | –
 | | AiCure | Coaches patients to control their condition and follow the instructions
 | | Infinity | Estimates outcomes and consequences of mobile-based monitoring for cancer cases aged ≥65 years
 | | Vik | Informs about cases' routine needs and concerns
Support | Counseling | Vivobot | Cognitive and behavioral psychological help
 | Emotional support | Youper | Daily support and tracking of the mental condition
 | | Wysa | –
 | | Replika | –
 | | Unmind | –
 | | Shim | –
 | | Woebot | –
Workflow efficiency | Administration | Sense.ly | Controls appointments and patients' conditions and recommends medications
 | | CareSkore | Monitors vitals and indicates the need for hospital admission
 | | Mandy | Aids healthcare staff by automating the case intake process
 | Patient encounter | HOLMeS | Helps with diagnosis, chooses suitable medications, and offers preventative check-ups
Health promotion | General lifestyle coaching | SWITCHes | Monitors the case's progress, offers help to physicians, and recommends appropriate activities
 | | CoachAI | –
 | | WeightMentor | Offers self-help activation for maintaining body weight and an open dialogue
 | Healthy eating | Healthy Hero | Aids in choosing foods to alter unhealthy eating habits
 | | Tasteful Bot | Cognitive behavioral medication
 | | Forksy | –
 | | SLOWbot | –
 | Smoking cessation | SMAG | –
 | | Bella | Helps users stop smoking
Table 3. The main advantages and disadvantages of the commonly used chatbots for cancer.
Chatbot | Advantages | Disadvantages
Medical Sieve | – | –
Quro | Automation, 24/7 availability, low operational costs, scalability, uniform responses, data processing, multilingual, remote access | Limited understanding in complex queries, contextual awareness, inaccuracies, required maintenance, data privacy, cybersecurity, biases, transparency, automation impact, empathy
Buoy Health | 24/7 availability, remote access, symptom checking, tailored advice, time saving, resource optimization, informed choices, next steps guidance | Diagnostic limitations, algorithm dependence, privacy and security concerns, limited scope, over-reliance, technical issues, ethical and regulatory challenges like bias and fairness
Harshitha breast cancer screening | Early detection, increased awareness, self-examination, improved imaging techniques, risk reduction | False positives and false negatives, overdiagnosis of non-threatening cancers, impact on the overall quality of life, radiation exposure, financial burden, accessibility issues
Babylon | 24/7 availability, remote access, reduced waiting times, quick symptom checking, affordable services, insurance integration, tailored advice, health monitoring, data-driven | Diagnostic limitations in complex cases, privacy and security issues, technology-dependent, app malfunctions, limited human touch, communication barriers, regulatory compliance, biases and fairness
Healthily (formerly Your.md) | 24/7 availability, remote access, free services, reduced health costs, user-friendly interface, tailored advice, personal health records, data protection | Diagnostic limitations, algorithm dependence, limited scope, lack of human touch, communication barriers, internet-dependent, technical issues, over-reliance on self-diagnosis
Ada | 24/7 availability, remote access, user-friendly interface, personalized health assessments, educational resources, privacy and security | Diagnostic limitations, algorithm-dependent, limited scope, lack of human touch, communication barriers, technology/internet-dependent, over-reliance on self-diagnosis
ItRuns | Early detection and prevention, tailored information, customized recommendations, advanced testing, broad coverage, awareness and understanding, supportive guidance, data security and privacy | Psychological impact, false positives and false negatives, complex interpretation, high costs, accessibility issues, privacy and ethical concerns, uncertain outcomes, scope limitations
Mathew | Early detection and prevention, tailored healthcare plans, risk reduction strategies, targeted therapies, detailed risk assessment, family health insights, informed decision-making, health literacy, at-home testing, online support | Psychological impact, false positives and false negatives, complex results, high costs, accessibility, data privacy, genetic discrimination, variable risk, incomplete coverage
Madhu | Early detection and prevention, tailored recommendations, custom health plans, informed decisions, health awareness, at-home testing, online results, data protection | Psychological impact, false positives and false negatives, misinterpretations, financial barriers, accessibility issues, data privacy, genetic discrimination, partial coverage, lifestyle and environmental factors
Divya | Proactive health management, customized preventive strategies, tailored recommendations, custom health plans, at-home testing, online results and support, informed decision-making, health literacy, data protection | Psychological impact, false positives and false negatives, misinterpretations, financial barriers, accessibility issues, data privacy, genetic discrimination, partial coverage, lifestyle and environmental factors
Rarhi | Proactive health management, preventive interventions, customized recommendations, tailored health plans, at-home testing kits, online results, understanding genetic risks, health awareness, data protection | Psychological impact, false positives and false negatives, complex results, financial barriers, limited access, data privacy, genetic discrimination, partial coverage, lifestyle and environmental factors
Watson for Oncology | Evidence-based recommendations, up-to-date information, rapid analysis, time saving, integration of multimodal data, personalized treatment, knowledge sharing, training and support | Performance variability, error rate, health system integration issues, data compatibility, data limitations, dependence on data quality, high costs, training and support, liability issues, patient privacy
STREAMD | Comprehensive data view, improved data access, predictive analytics, personalized care, streamlined processes, clinical alerts, patient portal, health monitoring | Data quality, false positives and false negatives, system compatibility, data standardization, data privacy, risk of breaches, high costs, training needs, technical issues, over-reliance on AI
Conversa | Improved communication, patient education, remote access, 24/7 availability, tailored interactions, health monitoring, streamlined processes, data integration, actionable insights, performance tracking | System compatibility, technical issues, data privacy, regulatory compliance, implementation costs, subscription fees, learning curve, resistance to change, reduced face-to-face interactions, possible miscommunication
Memora Health | Personalized communication, improved adherence, 24/7 availability, remote monitoring, automated workflows, data integration, centralized communication, collaborative tools, actionable analytics, performance tracking | System compatibility, technical issues, data privacy, regulatory compliance, implementation costs, subscription fees, user adoption and training (learning curve and resistance to change), over-reliance on automation, miscommunication risks
AiCure | Real-time monitoring, adherence support, interactive interface, behavioral insights, data-driven outcomes, automated process, remote management, improved data accuracy, real-time feedback | Data privacy, compliance issues, system compatibility, technical issues, user adoption and training (learning curve and resistance to technology), AI limitations, false positives and false negatives, implementation costs, subscription fees
Infinity | Comprehensive data integration, streamlined processes, tailored recommendations, adaptive algorithms, data-driven insights, predictive analytics, remote access, 24/7 availability, enhanced data collection, real-time monitoring | Data privacy, regulatory compliance, system compatibility, technical glitches, high costs, subscription fees, user adoption and training (learning curve and resistance to change), dependence on data quality, possible misinterpretations
Vik | Personalized recommendations, 24/7 accessibility, automated processes, streamlined workflows, actionable analytics, predictive capabilities, remote access, user-friendly interface | Data protection, compliance, system compatibility, technical problems, high implementation costs, subscription fees, training needs, resistance to change, dependence on data quality, possible misinterpretations
Vivobot | AI-powered conversations, behavioral insights, medication reminders, health monitoring, streamlined engagement, data collection, accessible everywhere, real-time feedback | Data security, regulatory compliance, AI limitations, technical issues, implementation costs, ongoing fees, user adoption and training (learning curve, resistance to new technology), dependence on data accuracy, miscommunication risks
Youper | Tailored interactions, adaptive learning, 24/7 availability, easy to use, mood monitoring, data insights, real-time feedback | Limited understanding, miscommunication risks, data privacy and security, compliance issues, resistance to technology, technology-dependent, subscription fees
Wysa | Evidence-based cognitive behavioral therapy and therapeutic exercises, 24/7 accessibility, user-friendly, custom interactions, mood tracking, connection to human therapists | Lack of human empathy, misunderstanding risks, sensitive information, compliance issues, high cost for premium features, technology-dependent
Replika | Companionship, conversational AI, self-improvement tools, emotional insights, 24/7 availability, easy to use, personalized experience | Limited emotional depth, miscommunication risks, data privacy and security, regulatory compliance, AI-dependent, in-app purchases
Unmind | Holistic approach, evidence-based content, tailored for organizations, engagement metrics, 24/7 accessibility, user-friendly, proactive approach | Sensitive data, compliance, privacy and security concerns, genetic resources, employee participation, subscription fees
Shim | Customized interactions, behavioral insights, 24/7 availability, user-friendly, immediate support, adaptability | Misinterpretation of complex issues, miscommunication risk, sensitive data handling, compliance, privacy and security concerns, user engagement, potential costs for premium features/services
Woebot | Evidence-based cognitive behavioral therapy, research-backed, 24/7 support, user-friendly, real-time conversations, affordable access | Limited depth, miscommunication risks, data privacy and security, regulatory compliance, resistance to technology, dependency risk
Sense.ly | Virtual assistant, real-time interaction, patient tracking, data integration, accessible anywhere, 24/7 availability, personalized interactions | AI limitations in understanding complex conditions, possible miscommunications, data privacy and security, regulatory compliance, system compatibility, technical issues, implementation costs
CareSkore | Risk evaluation, personalized recommendations, holistic view, continuous monitoring, intuitive design, accessibility, early intervention | Sensitive data handling, regulatory compliance, dependence on data quality, misinterpretation risks, subscription fees, adherence challenges
Mandy | Tailored resources, adaptive learning, 24/7 access, user-friendly, variety of resources, behavioral insights, data protection | AI limitations in understanding complex issues, miscommunication risks, user motivation, cost considerations for some premium features, technology-dependent
HOLMeS | Personalized interventions, health tracking, comprehensive health data, real-time updates, interactive tools, educational content, multi-platform availability | Sensitive information, regulatory compliance, user learning curve, integration challenges, complexity, implementation costs, AI algorithm accuracy
SWITCHes | Personalized coaching, goal setting, interactive features, feedback mechanisms, 24/7 availability, user-friendly, evidence-based scientific principles | Privacy and security, data management, regulatory compliance, user adherence, subscription fees, AI personalization limits
CoachAI | Tailored recommendations, adaptive learning, 24/7 accessibility, user-friendly, data-driven insights, real-time feedback, scalability | Limitations in understanding complex needs, miscommunication risks, data privacy and security, regulatory compliance, motivation challenges, subscription costs, usage fees
WeightMentor | Customized plans, adaptive feedback, comprehensive tracking, data insights, motivation tools, educational content, ease of use | Complex AI needs, misinterpretation risks, sensitive data handling, regulatory compliance, consistency, subscription fees
Healthy Hero | Comprehensive wellness, personalized recommendations, informative content, interactive features, goal setting, progress insights, motivation tools | Personalization limits, possible miscommunication, data privacy and security, regulatory compliance, consistency, cost for premium features
Tasteful Bot | Nutritional guidance with healthy recipes and nutritional information, user-based dietary preferences and adaptive suggestions, easy access, 24/7 availability, interactive experience | Limitations in complex dietary needs, risks of inaccurate recommendations, data management, regulatory compliance, adherence challenges, costs for premium features
Forksy | Tailored recommendations, adaptive algorithms, comprehensive health view, real-time monitoring, interactive features, educational content, accessible anywhere | Limitations in complex conditions, data accuracy, data privacy and security, regulatory compliance, consistency, potential fees
SLOWbot | Slow-paced interactions, stress reduction, engaging experience, personalization, accessible anytime, easy to use, mindfulness resources | Limited depth of support, possible miscommunication, user adherence, data management, niche focus
SMAG | Real-time tracking, adaptive recommendations, holistic view, actionable insights, interactive tools, educational resources, accessible across devices | Limitations in accuracy and interpretation, complex health issues, sensitive data, data privacy and security issues, regulatory compliance, subscription fees, adherence
Bella | Comprehensive approach, personalized interactions, engaging experience, real-time feedback, informative content, 24/7 access, user-friendly | Limited depth of support, miscommunication risks, data privacy and security issues, consistency, cost for premium features
