Opinion
Peer-Review Record

The Use of Artificial Intelligence in Clinical Care: A Values-Based Guide for Shared Decision Making

Curr. Oncol. 2023, 30(2), 2178-2186; https://doi.org/10.3390/curroncol30020168
by Rosanna Macri 1,2,3,* and Shannon L. Roberts 4
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 20 December 2022 / Revised: 28 January 2023 / Accepted: 1 February 2023 / Published: 9 February 2023

Round 1

Reviewer 1 Report

The article lists patient values associated with the use of AI in clinical decision-making. These values mirror the general concerns associated with the use of AI, which are well documented in the literature. These concerns have prompted calls for adherence to principles for the responsible use of AI; see, for example, the ACM statement on responsible algorithmic systems: https://www.acm.org/articles/bulletins/2022/november/tpc-statement-responsible-algorithmic-systems

The review lacks a robust discussion of patient values, their intertwined nature, and how they are currently given consideration within the clinical context of health care delivery. Systematic reviews are expected to follow the PRISMA guidelines (BMJ 2021;372:n71).

Most of the questions included in the proposed tool would have been addressed as part of the process of AI technology adoption in the clinic. Some of these questions are more relevant to the requisite training of healthcare professionals in the use of AI systems. More importantly, shared clinical decision-making rests on three inextricably intertwined elements: (1) patient quality-of-life considerations/wishes (demanded by the patient), (2) medical considerations (advocated/championed by the medical practitioners), and (3) the AI recommendations (dispensed by the AI systems). A guide/tool for shared decision-making when using AI in clinical care must, at a minimum, offer insights about this tridimensional decision-making problem and its tripartite conflicting objectives and constraints (clinical, workflow, training, infrastructure, patient familiarity with AI, etc.), and provide corresponding, actionable steps to foster shared decision-making.

Author Response

Thank you for your insightful feedback.

 

As you are aware, the paper required major revisions. We appreciate your patience. We offer this draft so that you can determine whether your main concerns have been addressed.

 

Page and line numbers mentioned throughout the responses refer to the ‘tracked changes’ version of the manuscript.

 

Reviewer 1

 

  1. “The article lists patient values associated with the use of AI in clinical decision-making. These values mirror the general concerns associated with the use of AI, which are well documented in the literature. These concerns have prompted calls for adherence to principles for the responsible use of AI; see, for example, the ACM statement on responsible algorithmic systems: https://www.acm.org/articles/bulletins/2022/november/tpc-statement-responsible-algorithmic-systems”

 

We have added a sentence to the introduction (page 1, lines 39-41) and the values section (page 3, lines 123-125) to acknowledge this. Although the patient values associated with the use of AI in clinical care are well documented in the literature, we identified a gap to be addressed: the implementation of patient values in shared decision-making when using AI in clinical practice. We are suggesting a way that they can be incorporated.

 

  2. “The review lacks a robust discussion of patient values, their intertwined nature, and how they are currently given consideration within the clinical context of health care delivery.”

 

The purpose of the commentary was to suggest a way to engage clinicians in values-based shared decision-making when considering the use of AI in clinical care. We have rewritten, and are continuing to rethink, the section on patient values to address the reviewer’s suggestions (page 3, lines 101-137). We would value your feedback on the direction we are now taking.

 

  3. “Systematic reviews are expected to follow the PRISMA guidelines (BMJ 2021;372:n71).”

 

The paper was not meant to be a systematic review. However, we appreciate that the style used in the previous version may have led the reader to that conclusion. We have redrafted the commentary to follow a more appropriate style.

 

  4. “Most of the questions included in the proposed tool would have been addressed as part of the process of AI technology adoption in the clinic. Some of these questions are more relevant to the requisite training of healthcare professionals in the use of AI systems. More importantly, shared clinical decision-making rests on three inextricably intertwined elements: (1) patient quality-of-life considerations/wishes (demanded by the patient), (2) medical considerations (advocated/championed by the medical practitioners), and (3) the AI recommendations (dispensed by the AI systems). A guide/tool for shared decision-making when using AI in clinical care must, at a minimum, offer insights about this tridimensional decision-making problem and its tripartite conflicting objectives and constraints (clinical, workflow, training, infrastructure, patient familiarity with AI, etc.), and provide corresponding, actionable steps to foster shared decision-making.”

 

We have revised, and continue to revise, the values section to address competing values, objectives, and constraints (page 3, lines 101-137). We have replaced the questions to consider in the table with Appendix A, which offers the clinician a practical guide (actionable steps) for engaging in a shared decision-making conversation with patients when considering the use of AI in clinical care. The guide is inspired by similar processes used in other areas of shared decision-making in healthcare that incorporate patient values (e.g., ethical decision-making tools or frameworks, serious illness conversation guides, advance care planning resources, and goals of care discussions).

Reviewer 2 Report

This is an interesting study, but the article does not describe the outline of AI application in clinical care or the detailed algorithm in the values-based tool. You should add this in the revision. In addition, please review the tables in the article and make them concise and readable.

The authors present the use of artificial intelligence in clinical care and a values-based tool to guide shared decision-making; however, shared decision-making is not clearly introduced to the oncology reader, and the clinical doctor is focused on how to acquire decision-making from AI.

It is a novel method.

This study shows a novel method for the use of artificial intelligence in clinical care.
It needs to be improved to be concise and clear to the reader.
The references are appropriate.
The tables need to be rephrased as concisely as possible.

Author Response

Thank you for your insightful feedback.

As you are aware, the paper required major revisions. We appreciate your patience. We offer this draft so that you can determine whether your main concerns have been addressed.

Page and line numbers mentioned throughout the responses refer to the ‘tracked changes’ version of the manuscript.

Reviewer 2

 

  1. “This is an interesting study, but the article does not describe the outline of AI application in clinical care or the detailed algorithm in the values-based tool. You should add this in the revision. In addition, please review the tables in the article and make them concise and readable.”

 

Appendix A offers the clinician a guide for engaging in a shared decision-making conversation with patients when considering the use of AI in clinical care. The updated guide helps the clinician incorporate the patient’s values into shared decision-making. We have edited the table to focus on the predominant patient values identified in the literature and to make it more concise and readable.

 

  2. “The authors present the use of artificial intelligence in clinical care and a values-based tool to guide shared decision-making; however, shared decision-making is not clearly introduced to the oncology reader, and the clinical doctor is focused on how to acquire decision-making from AI.”

 

We have added a section to the manuscript describing shared decision-making (page 2, line 56 to page 3, line 99) and provided recommendations with a practical guide (Appendix A) explaining how clinicians can engage in shared decision-making conversations with patients and incorporate their values when using AI in their clinical care (page 4, line 140 to page 5, line 183).

 

  3. “It is a novel method.

This study shows a novel method for the use of artificial intelligence in clinical care.

It needs to be improved to be concise and clear to the reader.

The references are appropriate.

The tables need to be rephrased as concisely as possible.”

We have edited the manuscript and table to make them clearer and more concise for the reader.

 

 

Reviewer 3 Report

In the present study, Macri and Roberts have reviewed the role of AI in clinical care. The work is impressive; however, there are some minor suggestions to improve the manuscript quality.

 

- Discussion section: highlight the results in more detail, mentioning the limitations and future outlook.

- Revise the manuscript to improve the written structure, clarify all vague sentences, and correct the grammar.

- A table should be provided covering recent efforts in AI in clinical care, including their strengths and weaknesses.

Author Response

Thank you for your insightful feedback.

As you are aware, the paper required major revisions. We appreciate your patience. We offer this draft so that you can determine whether your main concerns have been addressed.

Page and line numbers mentioned throughout the responses refer to the ‘tracked changes’ version of the manuscript.

Reviewer 3

“In the present study, Macri and Roberts have reviewed the role of AI in clinical care. The work is impressive; however, there are some minor suggestions to improve the manuscript quality.”

  1. “Discussion section: highlight the results in more detail, mentioning the limitations and future outlook.”

We have added to the discussion section to highlight the results in more detail (see the recommendations section, page 4, line 140 to page 5, line 183) and added a few sentences to the conclusions about future directions (page 5, lines 192-194). As the paper is meant to be a commentary suggesting a way to engage clinicians in values-based shared decision-making when considering the use of AI in clinical care, we did not include limitations.

  2. “Revise the manuscript to improve the written structure, clarify all vague sentences, and correct the grammar.”

We have edited the manuscript as requested.

  3. “A table should be provided covering recent efforts in AI in clinical care, including their strengths and weaknesses.”

This is outside the scope of our manuscript. In light of the major revisions since your last review, if you think it is still within scope, please describe your expectations in more detail.

 

 

 

 

Round 2

Reviewer 1 Report

The authors’ revisions are adequate.

Reviewer 2 Report

Accept.
