Perspective

Breaking down the Digital Fortress: The Unseen Challenges in Healthcare Technology—Lessons Learned from 10 Years of Research

by Alison Keogh 1,2, Rob Argent 2,3, Cailbhe Doherty 2,4,*, Ciara Duignan 2, Orna Fennelly 2, Ciaran Purcell 2,5, William Johnston 2 and Brian Caulfield 2,4

1 Clinical Medicine, School of Medicine, Trinity College Dublin, Tallaght University Hospital, D24 TP66 Dublin, Ireland
2 Insight Centre for Data Analytics, University College Dublin, D04 V1W8 Dublin, Ireland
3 School of Pharmacy and Biomolecular Sciences, RCSI University of Medicine & Health Sciences, D02 YN77 Dublin, Ireland
4 School of Public Health, Physiotherapy and Sports Science, University College Dublin, D04 V1W8 Dublin, Ireland
5 School of Allied Health, University of Limerick, V94 T9PX Limerick, Ireland
* Author to whom correspondence should be addressed.
Sensors 2024, 24(12), 3780; https://doi.org/10.3390/s24123780
Submission received: 24 April 2024 / Revised: 6 June 2024 / Accepted: 8 June 2024 / Published: 11 June 2024

Abstract
Healthcare is undergoing a fundamental shift in which digital health tools are becoming ubiquitous, with the promise of improved outcomes, reduced costs, and greater efficiency. Healthcare professionals, patients, and the wider public are faced with a paradox of choice regarding technologies across multiple domains. Research continues to seek methods and tools to revolutionise all aspects of health, from prediction and diagnosis to treatment and monitoring. However, despite its promise, the implementation of digital health tools in practice, and the scalability of innovations, remain stunted. Digital health is approaching a crossroads where we need to shift our focus away from simply developing new innovations towards seriously considering how to overcome the barriers that currently limit their impact. This paper summarises over 10 years of digital health experience from a group of researchers with backgrounds in physical therapy in order to highlight and discuss key lessons in the areas of validity, patient and public involvement, privacy, reimbursement, and interoperability. Practical learnings from this collective experience across patient cohorts are leveraged to propose a list of recommendations to enable researchers to bridge the gap between the development and implementation of digital health tools.

1. Introduction

The proliferation of digital health technologies is generally accepted to be a revolutionary development signifying a fundamental paradigm shift in how healthcare operates [1]. Digital health is a broad term encompassing electronically captured data, along with technical and communications infrastructure and applications in the healthcare ecosystem [1]. Advances in data analytics, wearable devices, artificial intelligence, and more are packaged as solutions which will improve efficiency and connect and empower stakeholders through proactive data sharing in a timely, flexible, and integrated manner [1,2,3,4]. Commercially, technology giants such as Apple, Google, Huawei, and Samsung are adding their weight to the system, offering health and performance measurements for users to monitor themselves, a successful strategy demonstrated by their market value, which is expected to reach USD 639.4 billion by 2026 [5,6,7].
Despite its promise, however, the digital health ecosystem remains murky, complex, and confusing. To date, digital health technologies have failed to demonstrate themselves as drivers of patient behaviour change [8,9,10], while sustained engagement with technologies has either been difficult to achieve or, ultimately, is not the aim, making it unclear how sustainable and scalable some solutions may be [11,12]. Furthermore, the implementation of digital health technologies into routine care remains fragmented, owing to various systemic issues across jurisdictions. In essence, digital health increasingly finds itself at a crossroads where it seeks to balance the continued development of innovative solutions with the real-world consequences and required adaptations necessary to actualise its potential.
Consequently, it is time for digital health researchers to pause and take stock of where we are, and where we wish to go, in this mission to improve healthcare. The authors are a group of researchers with over 10 years of experience in digital health research, specifically in the domain of the development of pre-commercial solutions, or the evaluation of technologies which are already commercially available. We have reflected on our experiences to date and identified five key lessons that we feel are currently limiting the potential for digital health technologies to develop further. In this perspective piece, we outline these lessons learned and offer recommendations for future research which we believe are fundamental to realise the potential for digital health technologies (Table 1).

2. Lessons Learned

2.1. Lesson 1: Validity Needs Revitalising to Compete with the Commercial Ecosystem

Despite the ubiquitous presence of digital health technologies, a big question remains: Can they provide valid and reliable estimations of biometric data? Validity is the foundation upon which the development of evidence-based interventions—and advancements in healthcare—are built. Despite its importance, validation poses challenges due to the dynamic nature of the sector and the distinct validation stages needed to demonstrate reliability to foster confidence in measurements [13,14].
We have undertaken validation at various stages of development [15,16,17,18,19,20,21,22,23,24,25,26,27] (Table 2). Each stage comes with its own challenges, not least the time needed to ensure each step is completed robustly [13,14,15]. This is true when developing any new hardware or software, or when independently testing existing commercial products. Consequently, traditional research dissemination methods struggle to keep pace with a commercial industry where hardware iterations are frequent and software updates, which can incorporate new processing strategies, often occur multiple times within a year. The result of this discrepancy is a lack of confidence and, thus, a potential lack of applicability for emerging technologies. Indeed, a recent umbrella review assessing the validity of consumer wearables indicated that devices show significant inaccuracies for certain metrics, particularly for the estimation of energy expenditure, step counting, and sleep and heart rate during vigorous activity [28].
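To make the validation challenge concrete, the following minimal Python sketch illustrates the kind of agreement analysis used in step-count validation studies, computing mean absolute percentage error (MAPE) and mean bias between a device and a criterion measure. The step counts shown are hypothetical, and a full protocol (e.g., following the INTERLIVE recommendations [19]) would involve considerably more than these two summary statistics.

```python
# Illustrative agreement analysis for a wearable step-count validation study.
# All device and criterion counts below are hypothetical example values.

def mape(device, criterion):
    """Mean absolute percentage error of device counts vs. criterion counts."""
    errors = [abs(d - c) / c * 100 for d, c in zip(device, criterion)]
    return sum(errors) / len(errors)

def mean_bias(device, criterion):
    """Mean signed difference (device - criterion); negative => undercounting."""
    return sum(d - c for d, c in zip(device, criterion)) / len(device)

criterion_steps = [500, 480, 520, 510, 495]   # hand-counted reference, per walking bout
device_steps    = [470, 490, 500, 515, 480]   # wearable output, per walking bout

print(f"MAPE: {mape(device_steps, criterion_steps):.1f}%")
print(f"Mean bias: {mean_bias(device_steps, criterion_steps):+.1f} steps")
```

In a real study, these summary statistics would be complemented by Bland–Altman limits of agreement and activity-specific subgroup analyses, since device error typically varies with gait speed and context.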
Benchtop testing offers quality assurance at the basic physical unit level and can underpin the validation of higher-level measures at later stages, yet concerns over competitive advantage may deter companies from adopting this testing method. Furthermore, the demonstration of validity, and indeed the development of machine learning algorithms, at this point does not always translate to real-world validity. Nevertheless, we find the possibility of citizen science promising. Users, already equipped with devices, could contribute their data for research, strengthening validation exercises [29]. Several research institutions and companies are now embracing this approach, known as data altruism (https://shil.stanford.edu/myphd/ [accessed on 15 December 2023]; https://allofus.nih.gov/ [accessed on 15 December 2023]; https://wetrac.ucalgary.ca/ [accessed on 15 December 2023]; https://www.ukbiobank.ac.uk/ [accessed on 15 December 2023]; https://tryvital.io/ [accessed on 15 December 2023]; https://thryve.health/ [accessed on 15 December 2023]; https://www.fitabase.com/ [accessed on 15 December 2023]; https://www.labfront.com/ [accessed on 15 December 2023]; https://www.fitrockr.com/ [accessed on 15 December 2023]). However, such ‘agile’, real-world validation methods nonetheless require standardised device- and outcome-specific assessment protocols to allow pooling and comparison of data.
Herein lies the call to action for researchers: foster partnerships with research groups in other institutions and companies, and develop validation protocols that can leverage real-world data, to fast-track validation and encourage public engagement in validation studies. This collaborative effort can yield transparency, troubleshoot performance issues, and potentially offer cost savings during development. For clinicians, understand that validation is a continuous process that seeks to ensure data integrity and reliability. As digital health technologies become more intertwined with patient care, critically appraising these tools for their validity is vital to maintain patient safety and data reliability. Furthermore, there is a need for greater transparency in reporting validation methods, likely through the development of agreed standards of reporting.

2.2. Lesson 2: Patients Need to Be Our Partners, Not Simply Our End-Users

Irrespective of the effectiveness and validity of a technology, if the intended user is unable, or not motivated, to interact with it, it will not succeed in changing outcomes. We have gathered ample evidence that people see value in remotely gathering their health information [17,20,30,31,32,33,34,35]; however, the current reality is that monitoring may not meet expectations or may fail to answer the questions that users have [31,36,37,38,39]. True value and the focus on patient needs can get lost during a development process that typically concentrates on technical elements, while the unmet need for digital interventions is rarely considered [1,40]. We have found that the usability of wearables is either not formally tested or is tested in a manner that is, at best, basic [30], while pilot testing of devices is rarely undertaken [37]. Thus, our experiences suggest that while many technologies are designed with patients in mind, they are not being designed with patients. This leads to solutions which may frustrate users, are not fit for purpose, cannot be implemented successfully, or fail to live up to their promise.
We have extensive evidence of working alongside patients in the development of digital health technologies to monitor various conditions including knee replacement rehabilitation [17], heart failure self-management [20,35], and real-world digital mobility outcome measures [41]. We have engaged with patients across each of the domains of patient and public involvement (PPI) in our work (https://www.nihr.ac.uk/documents/briefing-notes-for-researchers-public-involvement-in-nhs-health-and-social-care-research/27371 [accessed on 15 December 2023]), mostly within the Mobilise-D consortium, a public–private partnership which has developed digital mobility outcome measures of real-world walking across multiple patient cohorts. This has led to the identification of PPI recommendations [41], changes to protocols, public-facing dissemination activities, and more (https://youtu.be/qTazIpSC4DU?si=WwKYSKY2xBu2J2pe [accessed on 10 January 2024]; https://youtu.be/hherCpNiKLw?si=01E5EwPM2ww-xmc_ [accessed on 1 June 2024]; https://youtu.be/3FwD9XZynHo?si=nHSkxjqQpQxGJDBp [accessed on 15 December 2023]; https://youtu.be/Y_rfqCROIDQ?si=pRI2Fq0O49Bm5B4M [accessed on 15 December 2023]; https://mobilise-d.eu/ppag-activities-and-contributions/ [accessed on 15 December 2023]). Engaging meaningfully in this user-centred design approach means that the eventual solution might not be the one originally envisaged, might not be a net positive for all types of users, and might not simply be a digitisation of the current care pathway. However, understanding fundamental needs, and the barriers and facilitators to solutions, may enable better engagement and impact and will certainly result in a reduction in waste.
We consider the continuing lack of PPI to be a significant barrier to the successful implementation of digital health technologies. Funding and regulatory bodies are beginning to acknowledge this by making PPI mandatory in submissions from academia and industry. Consequently, it is imperative that researchers and clinicians include it as standard in their work. We encourage researchers and clinicians to engage with the various bodies and organisations that now exist to support researchers with this. This includes PPI guidelines, patient societies, and bodies who support and train patients to be research partners (https://eupati.eu/ [accessed on 10 January 2024]; https://ipposi.ie/ [accessed on 10 January 2024]) as well as academic institutional supports to support PPI and design thinking training (e.g., https://ppinetwork.ie/ [accessed on 10 January 2024]). Finally, we call on researchers and clinicians to actively challenge industry partners and start-ups about how PPI has been integrated into the design of their solutions prior to implementing them in studies or practice.

2.3. Lesson 3: Digital Health’s Double-Edged Sword—Innovation vs. Privacy

Continuous health monitoring brings with it immense potential but also significant threats to confidentiality and privacy, particularly with the increasing volume of commercial tools. The trajectory of this rapidly evolving ecosystem, swayed by the influence of regulatory architecture, could culminate in either utopian or dystopian outcomes. The former envisages an environment characterised by comprehensive regulations guaranteeing judicious data utilisation for maximum societal benefit, with informed consent and privacy enshrined as fundamental tenets [42,43]. In contrast, the dystopian scenario foreshadows an arena of rampant misuse by healthcare data brokers, unauthorised data access, privacy transgressions, and the commodification of health data, catalysed by inadequate regulations and discordant international standards [42]. The reality is likely somewhere in the middle. Technical infrastructure can support the secure sharing of data in locked environments which retain privacy. We are currently undertaking work on the regulatory and technical safeguards required to do this within Ireland, in a way that supports federated data sharing within the European Union [44,45].
Our interactions with patients suggest that their behaviour relates to the trust they have in the person or institution implementing the technology [46]. Specifically, there is an assumption that researchers and healthcare professionals have participants’ best interests at heart and are unlikely to deploy technologies that may put them at risk. However, the privacy paradox has shown that despite concerns, people readily disclose and share data with various companies, including those they have low trust in [47,48,49]. This paradox is domain dependent and is linked to technical literacy [47,48,50], emphasising the great responsibility that falls upon researchers as the gatekeepers of participants’ privacy.
However, within this, we have experienced another conflict that limits the potential for researchers to progress the digital health space. Specifically, in some jurisdictions, although commercial products are available for individuals to use and purchase, when researchers seek to evaluate these same products in studies, they are met with walls of academic institutional privacy barriers. This includes the need for data processing agreements with the companies whose products are being used. Some companies do not wish to formally engage with researchers seeking to independently assess their products and therefore will not enter into data processing agreements with them. Others simply do not see the need for it as their commercially available products are not intended as research tools. Thus, while we acknowledge the importance of thorough data management procedures, we must nonetheless admit that these standards are also limiting the independent testing of existing digital health technologies and consequently reduce our ability to evaluate their effectiveness, validity, and implementation.
In light of this, we propose that all biometric data be considered digital specimens, warranting the same rigor, care, and caution accorded to their physical analogues [43]. Privacy, in this context, transcends the basic need for data protection to encompass the individual’s right to dictate the access, manipulation, and dissemination of their personal data. In the commercial space, the urgent necessity for privacy is underscored by the potential misappropriation by health data brokers and the shortcomings of end-user licence agreements, which tend to prioritise corporate immunity over user protection [43,49]. Consequently, it is incumbent upon researchers and clinicians to adopt a cautious and informed approach when interpreting data from consumer devices presented by patients, considering the source and validity of the data (see lesson 1), while providing counsel on data privacy and protective measures. Finally, in order to ensure that privacy concerns are holistically addressed and allayed, healthcare professionals are urged to engage in a detailed and systematic evaluation of each device’s security measures (Table 3).
In conjunction with this, security issues need to be addressed alongside privacy. Digital health researchers might not possess the specialised skills necessary to thoroughly evaluate the security of digital tools. Recognising this, it is important to highlight existing security standards and processes that healthcare systems and research organisations are adopting to address these challenges. For instance, the Digital Technology Assessment Criteria (DTAC) in the UK provides a standardised framework to assess the security and clinical safety of digital health technologies. Similarly, ORCHA (Organisation for the Review of Care and Health Apps) conducts evaluations of health apps worldwide to ensure they meet predefined security and privacy standards. Although these processes have their limitations and challenges, they represent significant steps toward systematising security assessments in digital health. At the institutional level, many research organisations and universities have established security review protocols and requirements. These internal reviews are important for ensuring that digital health technologies used in research comply with necessary security standards, thus mitigating risks associated with data breaches and unauthorised access.

2.4. Lesson 4: The Interplay of Commercialisation and Reimbursement in Shaping Digital Health’s Real-World Reach

Widespread adoption of digital health technologies requires a business model that is suitable for all stakeholders, a method of reimbursement that sustains their development and implementation beyond the life of a research grant. A number of major players who have attracted investment in recent years have pivoted from their initial offering to provide a sustainable business model which delivers a new care pathway, rather than offering a technology to be embedded within an existing one (e.g., https://www.hingehealth.com [accessed on 15 December 2023]; https://swordhealth.com [accessed on 15 December 2023]). Whilst these pivots have been innovative, and potentially more disruptive than their initial offering, they were largely driven by the need for a sustainable reimbursement model that provides cost-effectiveness for all.
Despite positive outcomes in early research stages, projects often fail to achieve adoption as they are not financially sustainable. This is certainly the experience of the authors, who have explored commercial opportunities of research outputs in the domain of physical therapy and found the primary stumbling block to be the prevalence of fee-for-service models, a barrier that has been previously highlighted elsewhere [51]. In many cases, where digital health technologies can lead to proactive and preventative healthcare management, they seek to reduce the utilisation of services or contact points in the system. For many using a fee-for-service model, increasing expenditure for a tool which improves efficiency but reduces the number of clinic visits, and therefore income, is counterintuitive. Whilst the technology may provide better patient care, there is no motivation for the buyer to adopt the system into practice.
Conversely, in public health systems, the motivation to improve efficiency can lead to cost savings. The challenge for achieving implementation, though, is in gathering the evidence required to prove cost-effectiveness, which can take many years and, as such, requires a large amount of upfront investment and associated risk. Consequently, in the authors’ experience, the first point of entry for new technologies is rarely public health systems. As a result, rather than revolutionising healthcare, digital health technologies developed in fee-for-service models actually risk widening health inequality and the digital divide rather than reducing them [52,53,54,55,56]. There are growing moves towards bundled payment models in the form of value-based care [51,57], where payment is based on the outcome of care rather than the quantity, thus providing motivation to offer the most efficient service whilst still delivering high-quality care. There is a need for regulators, national governments, departments of health, and other bodies to become more closely involved in planning for such shifts in policy to effect meaningful change.
We consider the use of appropriate reimbursement models to be critical to facilitate the adoption of digital health technologies, and we recommend all researchers and clinicians consider cost-effectiveness when designing or selecting a digital health technology to implement [58]. Researchers should consider the variety of reimbursement mechanisms, how they differ between jurisdictions, and the evidence requirements associated with each. All stakeholders can actively lobby authorities to adapt their reimbursement mechanisms to embrace the opportunity of digital health, whether it is value-based care or the successful DiGA framework in Germany [59], which allows for the prescription and payment of digital health interventions to be funded much like pharmacological interventions.

2.5. Lesson 5: Digital Health’s Future Hinges on Interoperability

The true potential for innovation lies in the realm of interoperability. Serving as the fundamental cornerstone for effectively harnessing digital health data, the aim of interoperability is to bridge the chasm that exists between insular data repositories and individual health technologies. As it stands, however, the current digital health landscape is more reminiscent of a mosaic of disjointed ‘small data’ than the idealised concept of ‘big data’. Indeed, our own research projects have highlighted the barriers to the adoption and usefulness of digital health technology that result from siloed information that is difficult for anyone other than end-users to access or act upon [17,33,34,35,36,37]. Furthermore, proprietary systems typically fail to, and are not required to, provide easy access to third parties, thus limiting data flows and innovation. Barriers to interoperability can be syntactic, whereby systems cannot communicate with each other, or semantic, whereby even when data can be accessed, they are held in different formats that preclude aggregation [44]. Consequently, for digital health to realise its full potential, there is a need to design technologies that facilitate seamless communication across IT systems. Linked to this is the need both to establish standardised data formats internationally [60] and to promote the consistent use of standardised terminologies such as SNOMED-CT, DICOM, and LOINC where possible [60,61].
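As an illustration of what semantic interoperability looks like in practice, the sketch below constructs a minimal HL7 FHIR-style Observation in Python for a daily step count, attaching a LOINC coding so that any conformant system can interpret the value. The patient reference and the values are hypothetical, and a production resource would carry additional metadata required by the relevant FHIR profile.

```python
# A minimal, illustrative FHIR-style Observation for a daily step count.
# The LOINC coding gives the value shared semantic meaning across systems;
# the patient identifier and measurement values are hypothetical.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "41950-7",  # LOINC: number of steps in 24 hours, measured
            "display": "Number of steps in 24 hour Measured",
        }]
    },
    "subject": {"reference": "Patient/example-123"},  # hypothetical patient ID
    "effectivePeriod": {"start": "2024-01-15", "end": "2024-01-16"},
    "valueQuantity": {
        "value": 8243,
        "unit": "steps",
        "system": "http://unitsofmeasure.org",  # UCUM units terminology
        "code": "{steps}",
    },
}

print(json.dumps(observation, indent=2))
```

Two systems exchanging this structure agree not only on syntax (a parseable resource) but also on semantics: the LOINC code identifies *what* was measured, independently of any vendor’s internal naming.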
In short, drawing from all lessons, the cost-effectiveness and successful implementation of digital health technologies require a fundamental shift by researchers to address the current lack of interoperability [62]. Beyond the technical requirements, many organisational requirements must also be met to ensure interoperability. Currently, researchers seek to design tools that are effective, valid, and useful, and only later consider where they fit within care pathways, who pays for them, and how they operate within a system. A pivot towards considering interoperability early in the process is crucial for the widespread adoption of digital health technologies, as well as for the general advancement of medical research. Further, interoperability has the capacity to enhance the overall quality of research, as data can be scrutinised by experts globally and across a myriad of sources.
Therefore, interoperability may well hold the key to unlocking the viability of digital health technologies by bolstering their cost-effectiveness and amplifying their capacity to deliver high-quality care. Not-for-profit organisations such as openEHR provide open specifications for the management, storage, and retrieval of data in electronic health records, while international standards for data structure such as Health Level Seven International (HL7), Fast Healthcare Interoperability Resources (FHIR), and Integrating the Healthcare Enterprise (IHE) now exist. Furthermore, the European Health Data Space and other federated data analysis projects (e.g., European Open Science Cloud: https://research-and-innovation.ec.europa.eu/strategy/strategy-2020-2024/our-digital-future/open-science/european-open-science-cloud-eosc_en [accessed on 10 January 2024]) promote the use of these standards which will become mandatory future requirements. We therefore recommend that researchers invest early in backend development to future-proof their infrastructure to be scalable, to have open APIs, and to consider their use beyond their own projects. Once the technology and data are there, we need unique identifiers to link people across datasets, along with incentives and legislation to ensure that sharing occurs and that there is security of the access methods to the data [63].
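The federated analysis approach mentioned above can be sketched in a few lines: each site computes a local aggregate inside its own secure environment, and only these summaries, never raw records, are pooled centrally to produce a cross-site statistic. The site names and values below are hypothetical, and real federated platforms add authentication, disclosure controls, and minimum-count thresholds on top of this basic pattern.

```python
# Toy sketch of federated analysis: sites share aggregates, not raw data.
# Site names and per-patient daily step counts are hypothetical.

def local_summary(values):
    """Aggregate computed inside a site's own secure environment."""
    return {"n": len(values), "total": sum(values)}

def federated_mean(summaries):
    """Central coordinator combines per-site summaries without raw access."""
    n = sum(s["n"] for s in summaries)
    total = sum(s["total"] for s in summaries)
    return total / n

site_data = {
    "hospital_a": [6200, 7100, 5800],
    "hospital_b": [9000, 8400],
    "hospital_c": [4700, 5100, 6600, 7300],
}

summaries = [local_summary(v) for v in site_data.values()]
print(f"Federated mean steps/day: {federated_mean(summaries):.0f}")
```

The privacy benefit comes from the design choice that raw records never leave their site; only counts and totals cross institutional boundaries, which is also what makes harmonised data formats across sites (Lesson 5) a prerequisite.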

3. Conclusions

This paper has summarised the collective experiences of a group of digital health researchers to highlight continued barriers and considerations in the effectiveness and implementation of digital health technologies. When standing at a crossroads, we have a choice: continue as we are, or change direction. If digital health research continues on its current path, it risks a never-ending cycle of unfulfilled potential, development without implementation, an on-going conflict between researchers and commercial entities with the patient caught in the middle, and the delivery of fragmented care which increases health inequities and the digital divide. Digital health is a complex, messy, and multi-faceted domain, and targeted changes in the way we conduct research are needed to move us forward. We do not propose to have all the answers; however, we have sought to outline key recommendations in the areas of validity, patient and public involvement, cost-effectiveness, privacy, and interoperability, based on our lessons learned, as a call to action for future studies and solution development in this space to implement and make meaningful change to healthcare outcomes.

Author Contributions

Conceptualisation, A.K., R.A., C.D. (Cailbhe Doherty), O.F., C.D. (Ciara Duignan), C.P., W.J. and B.C.; writing—first draft, A.K., R.A., C.D. (Cailbhe Doherty), O.F., C.D. (Ciara Duignan), C.P., W.J. and B.C.; writing—review and editing, A.K., R.A., C.D. (Cailbhe Doherty), O.F., C.D. (Ciara Duignan), C.P., W.J. and B.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Abernethy, A.; Adams, L.; Barrett, M.; Bechtel, C.; Brennan, P.; Butte, A.; Faulkner, J.; Fontaine, E.; Friedhoff, S.; Halamka, J.; et al. The Promise of Digital Health: Then, Now, and the Future. Natl. Acad. Med. Perspect. 2022. [Google Scholar] [CrossRef] [PubMed]
  2. Lupton, D. The digitally engaged patient: Self-monitoring and self-care in the digital health era. Soc. Theory Health 2013, 11, 256–270. [Google Scholar] [CrossRef]
  3. Caulfield, B.M.; Donnelly, S.C. What is Connected Health and why will it change your practice? QJM 2013, 106, 703–707. [Google Scholar] [CrossRef] [PubMed]
  4. Snowdon, A. Digital Health: A Framework for Healthcare Transformation; Healthcare Information and Management Systems Society: Chicago, IL, USA, 2020. [Google Scholar]
  5. Tresp, V.; Overhage, J.M.; Bundschus, M.; Rabizadeh, S.; Fasching, P.A.; Yu, S. Going Digital: A Survey on Digitalization and Large-Scale Data Analytics in Healthcare. Proc. IEEE 2016, 104, 2180–2206. [Google Scholar] [CrossRef]
  6. Peake, J.M.; Kerr, G.; Sullivan, J.P. A Critical Review of Consumer Wearables, Mobile Applications, and Equipment for Providing Biofeedback, Monitoring Stress, and Sleep in Physically Active Populations. Front. Physiol. 2018, 9, 743. [Google Scholar] [CrossRef] [PubMed]
  7. Samriddhi, C.; Roshan, D. Wearable Technology Market, Global Opportunity Analysis and Industry Forecast 2020–2031; Allied Market Research: Portland, OR, USA, 2022. [Google Scholar]
  8. Patel, M.S.; Asch, D.A.; Volpp, K.G. Wearable devices as facilitators, not drivers, of health behavior change. JAMA 2015, 313, 459–460. [Google Scholar] [CrossRef] [PubMed]
  9. McKay, F.H.; Wright, A.; Shill, J.; Stephens, H.; Uccellini, M. Using Health and Well-Being Apps for Behavior Change: A Systematic Search and Rating of Apps. JMIR mHealth uHealth 2019, 7, e11926. [Google Scholar] [CrossRef] [PubMed]
  10. Palacholla, R.S.; Fischer, N.; Coleman, A.; Agboola, S.; Kirley, K.; Felsted, J.; Katz, C.; Lloyd, S.; Jethwani, K. Provider- and Patient-Related Barriers to and Facilitators of Digital Health Technology Adoption for Hypertension Management: Scoping Review. JMIR Cardio 2019, 3, e11951. [Google Scholar] [CrossRef] [PubMed]
  11. Schlieter, H.; A Marsch, L.; Whitehouse, D.; Otto, L.; Londral, A.R.; Teepe, G.W.; Benedict, M.; Ollier, J.; Ulmer, T.; Gasser, N.; et al. Scale-up of Digital Innovations in Health Care: Expert Commentary on Enablers and Barriers. J. Med. Internet Res. 2022, 24, e24582. [Google Scholar] [CrossRef] [PubMed]
  12. Cripps, M.; Scarbrough, H. Making Digital Health “Solutions” Sustainable in Healthcare Systems: A Practitioner Perspective. Front. Digit. Health 2022, 4, 727421. [Google Scholar] [CrossRef]
  13. Keadle, S.K.; Lyden, K.A.; Strath, S.J.; Staudenmayer, J.W.; Freedson, P.S. A Framework to Evaluate Devices That Assess Physical Behavior. Exerc. Sport Sci. Rev. 2019, 47, 206–214. [Google Scholar] [CrossRef] [PubMed]
  14. Ash, G.; Stults-Kolehmainen, M.; Busa, M.A.; Gregory, R.; Garber, C.E.; Liu, J.; Gerstein, M.; Casajus, J.A.; Gonzalez-Aguero, A.; Constantinou, D.; et al. Establishing a Global Standard for Wearable Devices in Sport and Fitness: Perspectives from the New England Chapter of the American College of Sports Medicine Members. Curr. Sports Med. Rep. 2020, 19, 45–49. [Google Scholar] [CrossRef] [PubMed]
  15. Argent, R.; Bevilacqua, A.; Keogh, A.; Daly, A.; Caulfield, B. The Importance of Real-World Validation of Machine Learning Systems in Wearable Exercise Biofeedback Platforms: A Case Study. Sensors 2021, 21, 2346. [Google Scholar] [CrossRef] [PubMed]
  16. Argent, R.; Drummond, S.; Remus, A.; O’Reilly, M.; Caulfield, B. Evaluating the use of machine learning in the assessment of joint angle using a single inertial sensor. J. Rehabil. Assist. Technol. Eng. 2019, 6, 2055668319868544. [Google Scholar] [CrossRef] [PubMed]
  17. Argent, R.; Slevin, P.; Bevilacqua, A.; Neligan, M.; Daly, A.; Caulfield, B. Wearable Sensor-Based Exercise Biofeedback for Orthopaedic Rehabilitation: A Mixed Methods User Evaluation of a Prototype System. Sensors 2019, 19, 432. [Google Scholar] [CrossRef] [PubMed]
  18. Johnston, W.; O’Reilly, M.; Dolan, K.; Reid, N.; Coughlan, G.; Caulfield, B. Objective classification of dynamic balance using a single wearable sensor. In Proceedings of the 4th International Congress on Sport Sciences Research and Technology Support, Porto, Portugal, 7–9 November 2016; pp. 15–24. [Google Scholar]
  19. Johnston, W.; Judice, P.B.; García, P.M.; Mühlen, J.M.; Skovgaard, E.L.; Stang, J.; Schumann, M.; Cheng, S.; Bloch, W.; Brønd, J.C.; et al. Recommendations for determining the validity of consumer wearable and smartphone step count: Expert statement and checklist of the INTERLIVE network. Br. J. Sports Med. 2021, 55, 780–793. [Google Scholar] [CrossRef] [PubMed]
  20. Johnston, W.; Keogh, A.; Dickson, J.; Leslie, S.J.; Megyesi, P.; Connolly, R.; Burke, D.; Caulfield, B. Human-Centered Design of a Digital Health Tool to Promote Effective Self-care in Patients With Heart Failure: Mixed Methods Study. JMIR Form. Res. 2022, 6, e34257. [Google Scholar] [CrossRef] [PubMed]
  21. Johnston, W.; O’Reilly, M.; Coughlan, G.F.; Caulfield, B. Inertial Sensor Technology Can Capture Changes in Dynamic Balance Control during the Y Balance Test. Digit. Biomark. 2017, 1, 106–117. [Google Scholar] [CrossRef] [PubMed]
  22. Johnston, W.; Patterson, M.; O’Mahony, N.; Caulfield, B. Validation and comparison of shank and lumbar-worn IMUs for step time estimation. Biomed. Eng. Biomed. Tech. 2017, 62, 537–545. [Google Scholar] [CrossRef]
  23. O’Reilly, M.A.; Whelan, D.F.; Ward, T.E.; Delahunt, E.; Caulfield, B. Classification of lunge biomechanics with multiple and individual inertial measurement units. Sports Biomech. 2017, 16, 342–360. [Google Scholar] [CrossRef]
  24. O’Reilly, M.A.; Whelan, D.F.; Ward, T.E.; Delahunt, E.; Caulfield, B.M. Classification of deadlift biomechanics with wearable inertial measurement units. J. Biomech. 2017, 58, 155–161. [Google Scholar] [CrossRef]
  25. Whelan, D.F.; O’Reilly, M.A.; Ward, T.E.; Delahunt, E.; Caulfield, B. Technology in Rehabilitation: Evaluating the Single Leg Squat Exercise with Wearable Inertial Measurement Units. Methods Inf. Med. 2017, 56, 88–94. [Google Scholar] [CrossRef] [PubMed]
  26. Mico-Amigo, M.E.; Bonci, T.; Paraschiv-Ionescu, A.; Ullrich, M.; Kirk, C.; Soltani, A.; Küderle, A.; Gazit, E.; Salis, F.; Alcock, L.; et al. Assessing real-world gait with digital technology? Validation, insights and recommendations from the Mobilise-D consortium. J. Neuroeng. Rehabil. 2023, 20, 78. [Google Scholar] [CrossRef] [PubMed]
  27. Rochester, L.; Mazzà, C.; Mueller, A.; Caulfield, B.; McCarthy, M.; Becker, C.; Miller, R.; Piraino, P.; Viceconti, M.; Dartee, W.P.; et al. A roadmap to inform development, validation and approval of digital mobility outcomes: The Mobilise-D approach. Digit. Biomark. 2020, 4, 13–27. [Google Scholar] [CrossRef]
  28. Doherty, C.; Baldwin, M.; Keogh, A.; Caulfield, B.; Argent, R. Keeping Pace with Wearables: A Living Systematic Umbrella Review of Systematic Reviews Evaluating the Accuracy of Commercial Wearable Technologies in Health Measurement. Sports Med. 2023; under review. [Google Scholar]
  29. Muhlen, J.M.; Stang, J.; Skovgaard, E.L.; Judice, P.B.; Molina-Garcia, P.; Johnston, W.; Sardinha, L.B.; Ortega, F.B.; Caulfield, B.; Bloch, W.; et al. Recommendations for determining the validity of consumer wearable heart rate devices: Expert statement and checklist of the INTERLIVE Network. Br. J. Sports Med. 2021, 55, 767–779. [Google Scholar] [CrossRef] [PubMed]
  30. Keogh, A.; Argent, R.; Anderson, A.; Caulfield, B.; Johnston, W. Assessing the usability of wearable devices to measure gait and physical activity in chronic conditions: A systematic review. J. Neuroeng. Rehabil. 2021, 18, 138. [Google Scholar] [CrossRef]
  31. Keogh, A.; Dorn, J.F.; Walsh, L.; Calvo, F.; Caulfield, B. Comparing the Usability and Acceptability of Wearable Sensors Among Older Irish Adults in a Real-World Context: Observational Study. JMIR mHealth uHealth 2020, 8, e15704. [Google Scholar] [CrossRef]
  32. Duignan, C.; Slevin, P.; Caulfield, B.; Blake, C. Mobile athlete self-report measures and the complexities of implementation. J. Sports Sci. Med. 2019, 18, 405–412. [Google Scholar] [PubMed]
  33. Duignan, C.; Slevin, P.; Caulfield, B.; Blake, C. Exploring the use of mobile athlete self-report measures in elite Gaelic games: A qualitative approach. J. Strength Cond. Res. 2021, 35, 3491–3499. [Google Scholar] [CrossRef]
  34. Argent, R.; Slevin, P.; Bevilacqua, A.; Neligan, M.; Daly, A.; Caulfield, B. Clinician perceptions of a prototype wearable exercise biofeedback system for orthopaedic rehabilitation: A qualitative exploration. BMJ Open 2018, 8, e026326. [Google Scholar] [CrossRef]
  35. Keogh, A.; Brennan, C.; Johnston, W.; Dickson, J.; Leslie, S.J.; Burke, D.; Megyesi, P.; Caulfield, B. Six-month pilot testing of a digital health tool to support effective self-care in people with heart failure: A mixed methods study. JMIR Form. Res. 2023, 8, e52442. [Google Scholar] [CrossRef] [PubMed]
  36. Keogh, A.; Johnston, W.; Ashton, M.; Sett, N.; Mullan, R.; Donnelly, S.; Dorn, J.F.; Calvo, F.; Mac Namee, B.; Caulfield, B. “It’s Not as Simple as Just Looking at One Chart”: A Qualitative Study Exploring Clinician’s Opinions on Various Visualisation Strategies to Represent Longitudinal Actigraphy Data. Digit. Biomark. 2020, 4, 87–99. [Google Scholar] [CrossRef] [PubMed]
  37. Keogh, A.; Taraldsen, K.; Caulfield, B.; Vereijken, B. “It’s not about the capture, it’s about what we can learn”: A qualitative study of experts’ opinions and experiences regarding the use of wearable sensors to measure gait and physical activity. J. Neuroeng. Rehabil. 2021, 18, 78. [Google Scholar] [CrossRef] [PubMed]
  38. Keel, S.; Schmid, A.; Keller, F.; Schoeb, V. Investigating the use of digital health tools in physiotherapy: Facilitators and barriers. Physiother. Theory Pract. 2022, 39, 1449–1468. [Google Scholar] [CrossRef] [PubMed]
  39. van Berkel, N.; Luo, C.; Ferreira, D.; Goncalves, J.; Kostakos, V. The Curse of the Quantified Self: An Endless Quest for Answers. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers—UbiComp’15, Osaka, Japan, 7–11 September 2015; pp. 973–978. [Google Scholar]
  40. Kolasa, K.; Kozinski, G. How to Value Digital Health Interventions? A Systematic Literature Review. Int. J. Environ. Res. Public Health 2020, 17, 2119. [Google Scholar] [CrossRef] [PubMed]
  41. Keogh, A.; Mc Ardle, R.; Diaconu, M.G.; Ammour, N.; Arnera, V.; Balzani, F.; Brittain, G.; Buckley, E.; Buttery, S.; Cantu, A.; et al. Mobilizing Patient and Public Involvement in the Development of Real-World Digital Technology Solutions: Tutorial. J. Med. Internet Res. 2023, 25, e44206. [Google Scholar] [CrossRef] [PubMed]
  42. Matwyshyn, A. CYBER! BYU Law Rev. 2017, 6. Available online: https://digitalcommons.law.byu.edu/lawreview/vol2017/iss5/6 (accessed on 8 June 2024).
  43. Allen, A. Proceedings of the American Philosophical Association; American Philosophical Association: Newark, DE, USA, 2019; Volume 93, pp. 21–38. [Google Scholar]
  44. Fennelly, O.; Voisin, B.; Olszewska, M.; Corrigan, D.; Moriarty, F.; Fahey, T.; Sarkar, D.; Nagy, S.; Wong, S. DASSL “Data Access Sharing Storage & Linkage” Proof-of-Concept: Health and Related Data Linkage in Ireland. Int. J. Popul. Data Sci. 2022, 7, 1908. [Google Scholar] [CrossRef]
  45. Fennelly, O.; Moriarty, F.; Corrigan, D.; Grogan, L.; Wong, S. Proof of Concept: Technical Prototype for Data Access Storage Sharing and Linkage DASSL to Support Research and Innovation in Ireland; Health Research Board: Dublin, Ireland, 2022. [Google Scholar]
  46. Keogh, A.; Alcock, L.; Brown, P.; Buckley, E.; Brozgol, M.; Gazit, E.; Hansen, C.; Scott, K.; Schwickert, L.; Becker, C.; et al. Acceptability of wearable devices for measuring mobility remotely: Observations from the Mobilise-D technical validation study. Digit. Health 2023, 9, 20552076221150745. [Google Scholar] [CrossRef]
  47. Fox, G. “To protect my health or to protect my health privacy?” A mixed-methods investigation of the privacy paradox. J. Assoc. Inf. Sci. Technol. 2020, 71, 1015–1029. [Google Scholar] [CrossRef]
  48. Hirschprung, R.S. Is the Privacy Paradox a Domain-Specific Phenomenon? Computers 2023, 12, 156. [Google Scholar] [CrossRef]
  49. Obar, J.; Oeldorf-Hirsch, A. The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services. Inf. Commun. Soc. 2020, 23, 128–147. [Google Scholar] [CrossRef]
  50. Buschel, I.; Mehdi, R.; Cammilleri, A.; Marzouki, Y.; Elger, B. Protecting human health and security in digital Europe: How to deal with the “privacy paradox”? Sci. Eng. Ethics 2014, 20, 639–658. [Google Scholar] [CrossRef] [PubMed]
  51. Porter, M. What is value in health care? N. Engl. J. Med. 2010, 363, 2477–2481. [Google Scholar] [CrossRef] [PubMed]
  52. Stephens, M.; Mankee-Williams, A. The sickening truth of the digital divide: Digital health reforms and digital inequality. J. Soc. Incl. 2021, 12, 20–29. [Google Scholar]
  53. Eruchalu, C.N.; Pichardo, M.S.; Bharadwaj, M.; Rodriguez, C.B.; Rodriguez, J.A.; Bergmark, R.W.; Bates, D.W.; Ortega, G. The Expanding Digital Divide: Digital Health Access Inequities during the COVID-19 Pandemic in New York City. J. Urban Health 2021, 98, 183–186. [Google Scholar] [CrossRef] [PubMed]
  54. Makri, A. Bridging the digital divide in health care. Lancet Digit. Health 2019, 1, e204–e205. [Google Scholar] [CrossRef]
  55. Watts, G. COVID-19 and the digital divide in the UK. Lancet Digit. Health 2020, 2, e395–e396. [Google Scholar] [CrossRef] [PubMed]
  56. Hadjiat, Y. Healthcare inequity and digital health—A bridge for the divide, or further erosion of the chasm? PLoS Digit. Health 2023, 2, e0000268. [Google Scholar] [CrossRef]
  57. Charles, D.; Boyd, S.; Heckert, L.; Lake, A.; Peterson, K. Effect of payment model on patient outcomes in outpatient physical therapy. J. Allied Health 2018, 47, 72–74. [Google Scholar]
  58. Unsworth, H.; Dillon, B.; Collinson, L.; Powell, H.; Salmon, M.; Oladapo, T.; Ayiku, L.; Shield, G.; Holden, J.; Patel, N.; et al. The NICE Evidence Standards Framework for digital health and care technologies—Developing and maintaining an innovative evidence framework with global impact. Digit. Health 2021, 7, 20552076211018617. [Google Scholar] [CrossRef] [PubMed]
  59. Gensorowsky, D.; Witte, J.; Batram, M.; Greiner, W. Market access and value-based pricing of digital health applications in Germany. Cost Eff. Resour. Alloc. 2022, 20, 25. [Google Scholar] [CrossRef] [PubMed]
  60. Fennelly, O.; Cunningham, C.; Grogan, L.; Cronin, H.; O’Shea, C.; Roche, M.; Lawlor, F.; O’Hare, N. Successfully implementing a national electronic health record: A rapid umbrella review. Int. J. Med. Inform. 2020, 144, 104281. [Google Scholar] [CrossRef] [PubMed]
  61. Fennelly, O.; Grogan, L.; Reed, A.; Hardiker, N.R. Use of standardized terminologies in clinical practice: A scoping review. Int. J. Med. Inform. 2021, 149, 104431. [Google Scholar] [CrossRef]
  62. Lehne, M.; Sass, J.; Essenwanger, A.; Schepers, J.; Thun, S. Why digital medicine depends on interoperability. NPJ Digit. Med. 2019, 2, 79. [Google Scholar] [CrossRef]
  63. Fennelly, O.; Moroney, D.; Doyle, D.; Eustace-Cook, J.; Hughes, M. Interoperability of Patient Portals with Electronic Health Records: A Scoping Review. Int. J. Med. Inform. 2023; under review. [Google Scholar]
Table 1. List of recommendations for digital health researchers for future digital health projects.
| # | Recommendation | Lesson Linked to |
|---|---|---|
| 1 | Leverage real-world data to develop new validation protocols. | Validity |
| 2 | Foster partnerships with companies and research groups looking to use citizen science and real-world validation. | |
| 3 | Encourage public involvement in validation studies. | |
| 4 | Ensure validity at multiple time points has been measured before implementing a tool clinically. | |
| 5 | Engage with existing academic structures that can support the development and integration of PPI into studies from the start. | Patient and public involvement |
| 6 | Adopt a user-centred design process, with PPI contributors as equal partners as standard within studies. | |
| 7 | Actively challenge industry partners and startups about how PPI has been integrated into their solutions prior to implementing them. | |
| 8 | Cautiously interpret data from commercial devices, considering the source and validity of the data. | Data privacy |
| 9 | Counsel participants and patients on data privacy and protection measures. | |
| 10 | Engage in a detailed and systematic evaluation of each device’s security measures prior to implementation. | |
| 11 | Consider reimbursement models during the design process: who will pay for the tool, how does it fit within current models, or are new models and pathways needed? | Cost-effectiveness |
| 12 | Consider cost-effectiveness before implementing tools in studies. | |
| 13 | Lobby authorities to adapt their reimbursement mechanisms to embrace the opportunity of digital health. | |
| 14 | Consider interoperability early in the design process: what is needed to allow the tool to integrate with existing pathways and other tools? | Interoperability |
| 15 | Invest early in backend capabilities that will allow the infrastructure to scale. | |
| 16 | Build open APIs into tools and consider, from the start, their use beyond the original project. | |
Table 2. Device evaluation stages based on the work of Keadle et al. [13] and Ash et al. [14].
| Validity Stage | Benchtop | Laboratory | Free-Living | Implementation |
|---|---|---|---|---|
| Aim of stage | The device is evaluated in response to standardised synthetic signals. | The device is tested in human participants under controlled conditions; outputs are compared to gold-standard criterion measures. | The device is tested in human participants in naturalistic and variable (‘free-living’) conditions; outputs are compared to field-based or practical criterion measures. | The device is utilised in a healthcare research setting, where its performance, usability, and impact on patient outcomes are evaluated. |
| Example process for stage (based on an accelerometer measuring step counts) | Attach the accelerometer to a calibrated shaker plate and compare its outputs to the expected accelerations. | Participants undergo a standardised walking test wearing the device, and the results are compared with gold-standard measures (e.g., motion capture cameras). | Participants wear the device during daily activities, and the device-measured step count is compared with another validated device. | The device is used in a clinical trial to monitor patient step counts remotely. Its ability to accurately capture data, its ease of use for patients and staff, and its impact on patient outcomes are assessed. |
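To make the free-living comparison in Table 2 concrete, device-measured step counts are commonly compared against a criterion measure using the mean absolute percentage error (MAPE). The following Python sketch is illustrative only; the step counts are hypothetical and not drawn from any study cited here.

```python
def mape(device_counts, criterion_counts):
    """Mean absolute percentage error (%) between paired device and
    criterion step counts, e.g. across several walking bouts."""
    if len(device_counts) != len(criterion_counts):
        raise ValueError("paired measurements required")
    errors = [abs(d - c) / c * 100
              for d, c in zip(device_counts, criterion_counts)]
    return sum(errors) / len(errors)

# Hypothetical paired bouts: device vs. criterion (e.g. motion capture)
device = [98, 203, 151]
criterion = [100, 200, 150]
print(round(mape(device, criterion), 2))  # → 1.39
```

Validation frameworks such as the INTERLIVE statements [19,29] recommend reporting agreement metrics of this kind together with the conditions under which they were obtained.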
Table 3. Steps for researchers to consider when evaluating privacy and security concerns with digital health technologies.
Steps and Questions to Consider
1. Does the company have a privacy policy that clearly outlines how they collect, use, and store personal data?
2. Are there controls in place to prevent unauthorised access to personal data, such as strong passwords and secure login procedures?
3. Does the device have physical security measures in place, such as a secure enclosure or tamper-resistant hardware?
4. Is personal data encrypted when it is transmitted or stored on the device or on the company’s servers?
5. Does the company have a process in place for responding to data breaches or other security incidents?
6. Can users opt out of data collection or delete their personal data if they choose to do so?
7. Can users control the data that is collected and shared by the device, such as by adjusting privacy settings or disabling certain features?
8. Are there clear terms of service that explain how personal data may be used, including any third-party data sharing?
9. Are there physical security measures in place to protect personal data, such as secure servers and data centres?
10. Is the company transparent about any third-party data sharing or data analytics that may be conducted with personal data?
11. Does the company have clear processes for obtaining informed consent from users before collecting or using their personal data?
12. Is the company compliant with relevant privacy laws and regulations, such as the General Data Protection Regulation (GDPR) in the European Union and the California Consumer Privacy Act (CCPA) in the United States?
13. Who has control over the data that is generated using a digital health tool? Are there adequate controller–processor agreements in place if required by law?
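For teams screening multiple devices, the questions in Table 3 can be recorded as structured data so that evaluations are repeatable and comparable across products. The sketch below is a hypothetical helper, not from any cited work; the item wording is condensed from Table 3, and `screen_device` is an illustrative function name.

```python
# Items condensed from Table 3 (wording abbreviated for brevity).
PRIVACY_CHECKLIST = [
    "Privacy policy covers collection, use, and storage of personal data",
    "Controls against unauthorised access (strong passwords, secure login)",
    "Physical device security (secure enclosure, tamper resistance)",
    "Encryption in transit and at rest",
    "Process for responding to breaches and security incidents",
    "Users can opt out of collection and delete their data",
    "Users can control what is collected and shared (privacy settings)",
    "Clear terms of service, including third-party data sharing",
    "Physical security of servers and data centres",
    "Transparency about third-party sharing and analytics",
    "Informed consent obtained before collection or use",
    "Compliance with relevant law (e.g. GDPR, CCPA)",
    "Data control and controller-processor agreements in place",
]

def screen_device(answers):
    """Given one boolean per checklist item (True = satisfied),
    return the items that remain open concerns."""
    if len(answers) != len(PRIVACY_CHECKLIST):
        raise ValueError("one answer per checklist item required")
    return [item for item, ok in zip(PRIVACY_CHECKLIST, answers) if not ok]
```

Recording answers this way makes it straightforward to compare candidate devices side by side and to document, prior to implementation, which concerns were left unresolved.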
