Search Results (173)

Search Parameters:
Keywords = prevention bundles

16 pages, 1224 KB  
Review
Securing the Achilles’ Heel of Esophagectomy: An Updated Evidence-Based Roadmap for Anastomotic Leak Prevention
by Lorenzo Viggiani d’Avalos, Marcel A. Schneider, Diana Vetter, Pascal Burri, Daniel Gerö and Christian A. Gutschow
Cancers 2026, 18(8), 1294; https://doi.org/10.3390/cancers18081294 - 19 Apr 2026
Viewed by 235
Abstract
Background: Esophagectomy remains the definitive curative treatment for esophageal cancer but is historically burdened by significant procedure-related morbidity. Anastomotic leakage (AL) is still the “Achilles’ heel” of esophageal surgery, serving as a primary benchmark for surgical quality due to its profound impact on patient recovery, healthcare costs, and long-term oncological outcomes. While surgical expertise and perioperative care have matured, reported AL rates remain persistently high. This necessitates a shift in focus from purely technical modifications toward integrated, data-driven preventive strategies. Purpose: Five years after our initial review, this update synthesizes the rapid evolution in AL prevention. We evaluate the transition from empirical surgical pragmatism to evidence-based protocols, integrating recent breakthroughs in real-time perfusion monitoring, prophylactic endoluminal technologies, and multidisciplinary patient optimization. This work provides a contemporary “roadmap” for navigating the complexities of esophageal reconstruction. Conclusions: The prevention of AL has evolved into a multimodal “bundle” that begins well before the index operation. This review highlights the critical shift toward quantitative perfusion assessment via indocyanine green fluorescence angiography, which is increasingly replacing subjective visual inspection as the standard for anastomotic site selection. We discuss the emerging role of gastric ischemic preconditioning as a biological strategy to enhance conduit vascularity, alongside the paradigm of proactive management using preemptive endoluminal vacuum therapy to mitigate septic sequelae in high-risk cases. Furthermore, we examine technical refinements in conduit construction and conditioning—focusing on the ‘tension-perfusion’ relationship—and the essential role of structured prehabilitation within enhanced recovery after surgery frameworks. 
While the quality of evidence remains heterogeneous, the move toward standardized reporting and objective monitoring marks a new era of precision in esophageal surgery. Full article

20 pages, 862 KB  
Review
Predicting Sudden Cardiac Death in Heart Failure with Mildly Reduced/Preserved Left Ventricular Ejection Fraction: A Clinical Review
by Mauro Feola, Federico Landra, Cosimo Angelo Greco, Roberto Lorusso and Gaetano Ruocco
J. Clin. Med. 2026, 15(8), 3041; https://doi.org/10.3390/jcm15083041 - 16 Apr 2026
Viewed by 382
Abstract
Cardiac arrest is a common mode of death in patients with heart failure (HF), occurring more frequently in those with HF with a reduced left ventricular ejection fraction (HFrEF), and is responsible for 30–50% of cardiac deaths. Specific data on the risk of sudden cardiac death (SCD) in HF with a preserved ejection fraction (HFpEF) and HF with a mildly reduced ejection fraction (HFmrEF) are lacking, as are data on ventricular arrhythmias in this population. Considering the 0.3% person-year incidence rate of investigator-reported ventricular tachycardia (VT) and ventricular fibrillation (VF), the rate of SCD in the analyzed population appears to be 1.3% per year. Age, gender, history of diabetes and myocardial infarction, left bundle branch block (LBBB) on electrocardiogram (ECG), and the natural logarithm of N-terminal pro B-type natriuretic peptide (NT-proBNP) identified a subgroup of HFpEF patients at higher risk of sudden death (SD) (5-year cumulative incidence of 11%). In HFpEF patients, neither gliflozins nor finerenone demonstrated a beneficial effect on SCD incidence in comparison to placebo. A significantly lower rate of SCD emerged in patients treated with dapagliflozin (10 vs. 26 patients) among those with HF with an improved ejection fraction (HFimpEF), defined as patients with a previous left ventricular ejection fraction (LVEF) < 40%. Promising methods for predicting SCD in these patients include cardiac magnetic resonance, myocardial scintigraphy, genetic assessment, and electrophysiologic studies. In conclusion, arrhythmic SCD in HFpEF patients should not be considered merely an effect of VT/VF; bradyarrhythmia is probably more frequent and more dangerous. The effects of drugs in preventing SCD in HFpEF have not yet been demonstrated. Full article
(This article belongs to the Special Issue Clinical Challenges in Heart Failure Management: 2nd Edition)

28 pages, 769 KB  
Review
Neurological Complications in Intensive Care Units: From Delirium to Long-Term Cognitive Dysfunction—A Narrative Review
by Mateusz Szczupak, Jacek Kobak, Jolanta Wierzchowska, Amelia Dąbrowska, Wioletta Mędrzycka-Dąbrowska and Sabina Krupa-Nurcek
J. Clin. Med. 2026, 15(7), 2478; https://doi.org/10.3390/jcm15072478 - 24 Mar 2026
Viewed by 787
Abstract
Background/Objective: Advances in intensive care medicine have substantially improved the survival of critically ill patients; however, they have also revealed the growing burden of neurological complications that affect both short-term outcomes and long-term functioning. Neurological complications in the intensive care unit (ICU) include a wide spectrum of disorders, ranging from acute brain dysfunction such as delirium, coma, and encephalopathy to persistent cognitive impairment after discharge, which represents a key component of Post-Intensive Care Syndrome (PICS). Delirium affects approximately one-third of ICU patients and is independently associated with increased mortality, prolonged hospitalization, and worse long-term neurocognitive outcomes. Due to the limited effectiveness of pharmacological therapies, current clinical approaches emphasize prevention, early diagnosis, and non-pharmacological strategies in line with PADIS guidelines. This narrative review aims to provide a clinically relevant synthesis of neurological complications in adult ICU patients, conceptualized as a continuum from acute brain dysfunction to long-term cognitive impairment. Methods: A narrative review of the literature was conducted, focusing on studies addressing epidemiology, pathophysiology, risk factors, diagnostic strategies, and prevention of neurological complications in critically ill adults. Attention was given to delirium, ICU-acquired cognitive impairment, and their association with PICS, as well as to current guideline-based and non-pharmacological interventions. Results: Available evidence indicates that neurological complications in the ICU are multifactorial and result from the interaction between patient vulnerability, severity of illness, systemic inflammation, sedative exposure, and environmental factors. Delirium remains the most common manifestation of acute brain dysfunction and is strongly associated with adverse outcomes. 
Increasing evidence supports the effectiveness of structured screening, early mobilization, sleep optimization, and multidisciplinary care bundles in reducing delirium incidence and duration. Moreover, growing attention is directed toward post-ICU follow-up and rehabilitation to reduce long-term cognitive decline. Conclusions: Neurological complications should be considered a central component of critical illness and a continuum extending beyond ICU discharge. Early identification of high-risk patients, implementation of preventive strategies, and integration of acute and post-ICU care are essential to improve survival and long-term cognitive outcomes. Further research should focus on personalized preventive and neuroprotective approaches in critically ill patients. Full article
(This article belongs to the Section Intensive Care)

19 pages, 4435 KB  
Review
DNA Fragmentation Analysis in Human Sperm—Technical Instructions to Prevent False Positives and Negatives in Angle-Modulated Two-Dimensional Single-Cell Pulsed-Field Gel Electrophoresis
by Satoru Kaneko, Yukako Kuroda and Yuki Okada
Genes 2026, 17(3), 319; https://doi.org/10.3390/genes17030319 - 16 Mar 2026
Viewed by 475
Abstract
Over the past two decades, numerous studies have examined the etiological significance of DNA fragmentation in human sperm using methods such as the comet assay (CA), the sperm chromatin structure assay, the sperm chromatin dispersion assay, and the TUNEL assay. We developed single-cell pulsed-field gel electrophoresis techniques, including one-dimensional (1D-SCPFGE) and angle-modulated two-dimensional (2D-SCPFGE), to detect early signs of naturally occurring DNA fragmentation. Comparative studies using purified human sperm with and without DNA fragmentation revealed some technical limitations in the conventional methods. This technical review outlines the procedures to ensure the quantitative performance of SCPFGE: (1) The mass of naked DNA was prepared through simultaneous in-gel swelling and proteolysis, which are highly sensitive to chemical and physical factors. Notably, these processes are vulnerable to reactive oxygen species (ROS). We developed the anti-ROS SCPFGE system to prevent artifactual cleavages. (2) 1D-SCPFGE discharges long-chain fibers from the origin, separating fibrous and granular segments beyond the tips of the fibers. (3) During continuous electrophoresis after 150° rotation (2D-SCPFGE-0-150), long-chain fibers unexpectedly extended diagonally backward from the origin, with long fibrous segments pulled out from a bundle that extended during the first electrophoresis, indicating some fibrous segments were embedded within the long-chain fibers. Even when SCPFGE was employed, one-directional current led to false negatives. (4) 2D-SCPFGE with angle rotation is currently the most sensitive imaging method for single-nuclear DNA fibers. However, without knowing the size of DNA fragments, it remains a semi-quantitative analysis. (5) To prevent artifactual DNA cleavage caused by ice crystals, low-temperature liquid storage is recommended. 
(6) The in-gel proteolyzed naked DNA is suitable as a substrate for chemical and enzymatic DNA cleavage analyses. Full article

14 pages, 259 KB  
Article
Correlates of Integrated Human Papillomavirus Vaccination and Cervical Cancer Screening Protection in U.S. Low-Income Women
by Erika B. Biederman, Victoria L. Champion, Katharine J. Head, Teresa M. Imburgia and Gregory D. Zimet
Vaccines 2026, 14(3), 251; https://doi.org/10.3390/vaccines14030251 - 9 Mar 2026
Cited by 1 | Viewed by 722
Abstract
Background/Objectives: In the United States, adult human papillomavirus (HPV) vaccination coverage remains low at 20–50%, depending on age, and cervical cancer (CC) screening rates range from 68 to 76%. Few studies have evaluated characteristics of women who are both HPV vaccinated and up to date (UTD) with screening as an integrated outcome. The purpose of the present study was to classify women into four prevention categories and examine factors associated with being double protected compared to unprotected. Methods: Data were gathered via an online survey from a sample of low-income women (household income < USD 50,000) provided by a research survey company (n = 719). Women were classified into four categories: vaccinated only, screened only, both vaccinated and screened (double protected), or neither (unprotected). Sociodemographic characteristics, healthcare access, and Health Belief Model constructs were assessed. Multivariable logistic regression compared women who were double protected with those unprotected (n = 274). Results: Most women were UTD with screening only (57.8%), while 15.5% were double protected and 22.6% were unprotected. Younger age (Odds Ratio [OR] = 0.93; 95% Confidence Interval [CI]: 0.89, 0.98), having ≥1 medical visit in the past year (OR = 4.16; 95% CI: 1.74, 9.95), higher perceived CC risk (OR = 3.65; 95% CI: 1.41, 9.43), greater perceived benefits of CC screening (OR = 1.96; 95% CI: 1.45, 2.66), and higher HPV knowledge (OR = 1.09; 95% CI: 1.01, 1.17) were associated with higher odds of being double protected. Conclusions: A substantial proportion of low-income women lack comprehensive CC prevention. Integrated, bundled prevention strategies that simultaneously promote HPV vaccination and screening may be important to reduce CC disparities. Full article
17 pages, 381 KB  
Article
Microorganisms and Mortality Factors in Hospitalized Hemodialysis Patients with Catheter-Related Bloodstream Infection and Infective Endocarditis: 7 Years of Experience
by Feyza Bora, Umit Cakmak, Özlem Esra Yıldırım and Funda Sarı
J. Clin. Med. 2026, 15(5), 1815; https://doi.org/10.3390/jcm15051815 - 27 Feb 2026
Viewed by 465
Abstract
Background and Objectives: Catheter-related bloodstream infections (CRBSIs) and infective endocarditis (IE) lead to substantial morbidity, prolonged hospitalizations, and increased mortality. This study aimed to determine the incidence of IE among hospitalized catheter-dependent hemodialysis (HD) patients with CRBSI and identify risk factors associated with 90-day all-cause mortality. Materials and Methods: We conducted a retrospective analysis of patients diagnosed with CRBSI. Clinical, microbiological, and accessible echocardiographic data were evaluated. Risk factors for 90-day mortality were analyzed using univariate analysis and multivariable binary logistic regression models. Results: A total of 85 hospitalized catheter-dependent HD patients with CRBSI were included. Gram-positive organisms were the predominant pathogens (70.6%), with Staphylococcus aureus identified in 35.3% (30/85) of all CRBSI cases. Gram-negative bacteria accounted for 29.4% of all CRBSIs. IE was identified in 9.4% (n = 8) of patients diagnosed with CRBSI. Significant differences were observed between the IE and non-IE groups in length of hospital stay, vegetation, and embolism (p < 0.05). The 90-day all-cause mortality rate was 14.1% (n = 12). Univariate analysis identified older age and female gender as associated with increased mortality (p < 0.05). In the multivariable binary logistic regression, only age (OR: 1.055, 95% CI: 1.005–1.107, p = 0.029) remained an independent predictor of 90-day mortality. Conclusions: In catheter-dependent HD patients, Staphylococcus aureus is the predominant organism associated with both CRBSI and IE. With IE observed in 9.4% of hospitalized catheter-dependent HD patients with CRBSI, consistent compliance with prevention bundles must be prioritized as a standard of care for catheter management. Full article
(This article belongs to the Section Infectious Diseases)

15 pages, 754 KB  
Review
Evidence on Measures for the Prevention of Pressure Injuries in Mechanically Ventilated Patients in Prone Positioning: A Systematic Review
by Simone Amato, Daniele Napolitano, Alessio Lo Cascio, Elena Conoscenti, Angela Lappa, Emilio D’avino, Mirko Masciullo, Antonello Pucci, Valentina De Bartolo, Claudia Torretta, Lucia Mitello, Anna Rita Marucci and Francesco Gravante
Healthcare 2026, 14(4), 443; https://doi.org/10.3390/healthcare14040443 - 10 Feb 2026
Viewed by 1326
Abstract
Background: Therapeutic prone positioning is widely used to improve oxygenation in patients with acute respiratory distress syndrome but is associated with an increased risk of pressure injuries, particularly affecting facial and anterior body regions. Methods: This systematic review was conducted according to PRISMA 2020 and Joanna Briggs Institute guidelines and was prospectively registered in PROSPERO (CRD42023442604). PubMed, CINAHL, Web of Science, Scopus, and the Cochrane Library were searched from inception to June 2025, including grey literature. Primary studies involving adult, mechanically ventilated patients undergoing therapeutic prone positioning and evaluating pressure injury prevention strategies were included. Methodological quality was assessed using JBI critical appraisal tools. Owing to clinical and methodological heterogeneity, findings were synthesized using a Synthesis Without Meta-analysis (SWiM) approach. Results: Eight studies with heterogeneous designs were included. Preventive interventions mainly comprised prophylactic dressings, repositioning and support devices, and comprehensive care bundles. Most strategies were associated with a reduction in pressure injury incidence, particularly in facial and anterior anatomical areas. Greater effectiveness was observed when interventions were implemented within structured protocols supported by staff training and multidisciplinary coordination. Conclusions: Preventive strategies appear effective in reducing pressure injuries associated with prone positioning in critically ill patients. The implementation of standardized, bundled prevention protocols may improve patient safety in intensive care settings. Full article

17 pages, 469 KB  
Review
Neurological Complications After Thoracic Endovascular Repair (TEVAR): A Narrative Review of the Incidence, Mechanisms and Strategies for Prevention and Management
by Francesca Miceli, Marta Ascione, Rocco Cangiano, Antonio Marzano, Alessia Di Girolamo, Giovanni Gagliardo, Luca di Marzo and Wassim Mansour
J. Pers. Med. 2026, 16(2), 77; https://doi.org/10.3390/jpm16020077 - 1 Feb 2026
Cited by 1 | Viewed by 993
Abstract
Background: Thoracic endovascular aortic repair (TEVAR) has transformed the management of descending thoracic aortic disease, but neurological complications—particularly spinal cord ischemia (SCI), stroke, and postoperative delirium—remain among the most feared adverse events, adversely affecting survival, quality of life, and functional independence. Objectives: The aim of this study was to provide a contemporary narrative synthesis (2000–2025) of the incidence, mechanisms, risk factors, prevention, and management of neurological complications after TEVAR, emphasizing how current evidence supports individualized and risk-adapted strategies for prevention and management. Methods: A narrative, non-systematic search (PubMed/MEDLINE, Scopus, Cochrane Library; 2000–2025) was conducted using terms related to TEVAR, SCI, cerebrovascular events, delirium, and cognitive dysfunction. Priority was given to large registries, cohort studies, and systematic reviews in adult TEVAR populations. Results: Perioperative stroke occurs in ~2–6% of TEVAR cases, with higher rates in arch/zone 0–2 procedures and when the left subclavian artery (LSA) is covered without revascularization. SCI incidence ranges from ~2% to 9%, influenced by aortic extent and urgency; Vascular Quality Initiative data report SCI in 3.7% of procedures, with markedly reduced 1-year survival. Major SCI risk factors include extensive thoracic coverage, prior aortic repair, vertebral or hypogastric occlusion, emergency presentation, low perioperative mean arterial pressure, anemia, and chronic kidney disease. Postoperative delirium occurs in ~13% of TEVAR-treated type B dissections and correlates with longer hospitalization and early complications. Emerging nomograms for SCI and delirium enable individualized risk stratification. Conclusions: Neurological complications after TEVAR remain clinically significant.
Contemporary evidence supports personalized prevention—selective cerebrospinal fluid (CSF) drainage, LSA revascularization, staging, neuromonitoring, and tailored hemodynamic targets—guided by anatomical complexity, comorbidities, collateral network integrity, and prior aortic history. Further research should refine prediction tools, standardize definitions, and evaluate individualized neuroprotective bundles. Full article
(This article belongs to the Special Issue Complications in Vascular Surgery: Current Updates and Perspectives)

25 pages, 1012 KB  
Review
Cognitive Impact of Colorectal Cancer Surgery in Elderly Patients: A Narrative Review
by Oswaldo Moraes Filho, Bruno Augusto Alves Martins, Tuane Colles, Romulo Medeiros de Almeida and João Batista de Sousa
Cancers 2026, 18(3), 417; https://doi.org/10.3390/cancers18030417 - 28 Jan 2026
Viewed by 1217
Abstract
Background/Objectives: Postoperative cognitive dysfunction (POCD) represents a significant and potentially preventable complication in elderly patients undergoing colorectal cancer surgery, with reported incidence ranging from 2.8% to 62.2% depending on perioperative management strategies and assessment methods. This narrative review synthesizes current evidence on the epidemiology, pathophysiology, risk factors, and prevention strategies for POCD in this vulnerable population. Methods: A comprehensive narrative review was conducted to examine the current literature on POCD in elderly colorectal cancer patients. Evidence was synthesized from published studies addressing epidemiology, assessment tools, risk factors, pathophysiological mechanisms, and prevention strategies, with a particular focus on Enhanced Recovery After Surgery (ERAS) protocols and multicomponent interventions. Results: Advanced age, pre-existing cognitive impairment, frailty, and surgical complexity emerge as key risk factors for POCD. ERAS protocols demonstrate substantial protective effects, reducing POCD incidence from 35% under conventional care to as low as 2.8% in optimized pathways. The pathophysiology involves multifactorial mechanisms, including neuroinflammation, blood–brain barrier disruption, neurotransmitter dysregulation, and oxidative stress, with surgical trauma triggering systemic inflammatory cascades that activate microglial responses within the central nervous system. Evidence-based prevention strategies include preoperative cognitive and frailty screening, minimally invasive surgical techniques, multimodal opioid-sparing analgesia, regional anesthesia, depth-of-anesthesia monitoring, and structured postoperative care bundles adapted from the Hospital Elder Life Program. Conclusions: The integration of comprehensive perioperative cognitive care protocols represents a critical priority as surgical volumes in elderly populations continue to expand globally. 
Emerging directions include biomarker development for early detection and risk stratification, precision medicine approaches targeting individual vulnerability profiles, and novel therapeutic interventions addressing neuroinflammatory pathways. Standardized assessment tools, multidisciplinary collaboration, and implementation of evidence-based preventive interventions offer substantial promise for preserving cognitive function and improving long-term quality of life in elderly colorectal cancer patients. Full article
(This article belongs to the Special Issue Surgery for Colorectal Cancer)

13 pages, 4195 KB  
Article
Impact of Rear-Hanging String-Cable-Bundle Shading on Performance Parameters of Bifacial Photovoltaic Modules
by Dan Smith, Scott Rand, Peter Hruby, Ben De Fresart, Paul Subzak, Sai Tatapudi, Nijanth Kothandapani and GovindaSamy TamizhMani
Energies 2026, 19(1), 126; https://doi.org/10.3390/en19010126 - 25 Dec 2025
Viewed by 652
Abstract
The 2025 International Technology Roadmap for Photovoltaics (ITRPV) projects that bifacial modules will dominate the photovoltaic (PV) market, reaching roughly 60–80% global share between 2024 and 2035, while monofacial PV modules will steadily decline. Current industry practice is to route the cable bundles along structural members such as main beams or torque tubes, thereby preventing rear-side shading but resulting in two key drawbacks: increased cable length and decreased system reliability due to cable proximity with rotating members and pinch points. Both effects contribute to higher system costs and reduced cable reliability. An alternative method involves suspending cable bundles directly behind the modules using hangers. While this approach mitigates excess length and risk of cable snags, it introduces the possibility of partial rear-side shading, which could possibly cause performance loss and hot-spot formation due to shade-induced electrical mismatch. Experimental evidence indicates that this risk is minimal, as albedo irradiance typically represents only 10–30% of front-side irradiance as reported in the literature and is largely diffuse, thereby limiting the likelihood of significant directional shading. This study evaluates the performance and reliability impacts of hanger-supported cable bundles under varying experimental conditions. Performance metrics assessed include maximum power output (Pmax), short-circuit current (Isc), open-circuit voltage (Voc), and fill factor (FF), while hot-spot risk was evaluated through measurements of module temperature uniformity using infrared imaging. Each cable (1X) was 6 AWG with a total outer diameter of approximately 9 mm. Experiments covered different cable bundle counts/sizes (2X, 6X, 16X), mounting configurations (fixed-tilt and single-axis tracker), and albedo conditions (snow-covered and snow-free ground). Measurements were conducted hourly on clear days between 8:00 and 16:00 from June to September 2025. 
The results consistently show that hanger-supported cable bundles have a negligible shading impact across all hours of the day and throughout the measurement period. This indicates that rear-side cable shading can be safely and practically disregarded in performance modeling and energy-yield assessments for the tested configurations, including fixed-tilt systems and single-axis trackers with or without torque tube shading and with various hanger sizes and cable-bundle counts. Hanging cables behind modules is therefore a cost-effective, reliable, and safe practice that can be recommended. Full article
(This article belongs to the Section A2: Solar Energy and Photovoltaic Systems)

27 pages, 1531 KB  
Review
Hospital Influenza Outbreak Management in the Post-COVID Era: A Narrative Review of Evolving Practices and Feasibility Considerations
by Wei-Hsuan Huang, Yi-Fang Ho, Jheng-Yi Yeh, Po-Yu Liu and Po-Hsiu Huang
Healthcare 2026, 14(1), 50; https://doi.org/10.3390/healthcare14010050 - 24 Dec 2025
Viewed by 1128
Abstract
Background: Hospital-acquired influenza remains a persistent threat that amplifies morbidity, mortality, length of stay, and operational strain, particularly among older and immunocompromised inpatients. The COVID-19 era reshaped control norms—normalizing N95 use during surges, ventilation improvements, and routine multiplex PCR—creating an opportunity to strengthen hospital outbreak management. Methods: We conducted a targeted narrative review of WHO/CDC/Infectious Diseases Society of America (IDSA) guidance and peer-reviewed studies (January 2015–August 2025), emphasizing adult inpatient care. This narrative review synthesizes recent evidence and discusses theoretical implications for practice, rather than establishing formal guidelines. Evidence was synthesized into pragmatic practice statements on detection, diagnostics, isolation/cohorting, antivirals, chemoprophylaxis, vaccination, surveillance, and communication. Results: Early recognition and test-based confirmation are pivotal. For inpatients, nucleic-acid amplification tests are preferred; negative antigen tests warrant PCR confirmation, and lower-respiratory specimens improve yield in severe disease. A practical outbreak threshold is ≥2 epidemiologically linked, laboratory-confirmed cases within 72 h on the same ward. Effective control may require immediate isolation or cohorting with dedicated staff, strict droplet/respiratory protection, and daily active surveillance. Early oseltamivir (≤48 h from onset or on admission) reduces mortality and length of stay; short-course post-exposure prophylaxis for exposed patients or staff lowers secondary attack rates. Integrated vaccination efforts for healthcare personnel and high-risk patients reinforce workforce resilience and reduce transmission. 
Conclusions: A standardized, clinician-led bundle—early molecular testing, do-not-delay antivirals, decisive cohorting and personal protective equipment (PPE), targeted chemoprophylaxis, vaccination, and disciplined communication—could help curb transmission, protect vulnerable patients and staff, and preserve capacity. Hospitals should codify COVID-era layered controls for seasonal influenza and rehearse unit-level outbreak playbooks to accelerate response and recovery. These recommendations target clinicians and infection-prevention leaders in acute-care hospitals. Full article
27 pages, 870 KB  
Article
Data Quality Improvement Supports Digital Transformation in Industry 5.0
by Jian Wang, Zhuowei Wu and Ting Wang
Sustainability 2025, 17(24), 11183; https://doi.org/10.3390/su172411183 - 13 Dec 2025
Viewed by 1114
Abstract
Data quality refers to the degree to which data content and formats fit their intended functions, and it plays a crucial role in firms' digital transformation. This study focuses on Industry 5.0, draws on Deming's System of Profound Knowledge on quality, and identifies four key factors influencing data quality that align with Industry 5.0 concepts: data variation, employee resilience, system integration, and digital variation knowledge management. A structural model among these factors was established to support digital transformation. An empirical study based on a questionnaire survey of 301 participants was conducted to test the model using structural equation modeling (SEM). The results show the following: (1) employee resilience and system integration each exert a positive effect on data variation and digital variation knowledge management; (2) data variation and digital variation knowledge management both positively affect digital transformation; and (3) employee resilience mediates system integration's effects on data variation and digital variation knowledge management. Based on these results, this paper proposes a novel approach to enhancing data quality in digital transformation from a sustainability perspective: (1) employee resilience and system integration should be bundled, with emphasis on the mediating role of employee resilience, forming a resilient firm capability; and (2) digital variation knowledge management safeguards data variation, can prevent and respond to data quality variation risks, and helps firms build better decision-making capacity. Under the resource orchestration view, the proposed model can convert resource identification into capability generation and then into value creation, helping firms achieve more sustainable development during digital transformation. Full article
(This article belongs to the Section Economic and Business Aspects of Sustainability)

18 pages, 841 KB  
Review
Cutaneous Adverse Events of Tyrosine Kinase Inhibitors in Endocrine Tumors: Clinical Features, Mechanisms, and Management Strategies
by Marta Marino, Francois Rosset, Alice Nervo, Alessandro Piovesan, Valentina Pala, Elisa Vaccaro, Luca Mastorino, Aldo E. Calogero and Emanuela Arvat
Biomedicines 2025, 13(12), 3044; https://doi.org/10.3390/biomedicines13123044 - 11 Dec 2025
Viewed by 1287
Abstract
Background: Tyrosine kinase inhibitors (TKIs) are crucial to treating endocrine-related malignancies, including advanced thyroid cancers and neuroendocrine tumors, but their benefit is tempered by cutaneous adverse events (CAEs) that impair adherence and quality of life. Objective: To summarize the dermatologic toxicities of TKIs used in endocrine oncology and provide practical, multidisciplinary guidance for prevention and management. Methods: Narrative synthesis of clinical trial reports, post-marketing studies, and specialty guidelines pertinent to lenvatinib, vandetanib, cabozantinib, and other commonly used TKIs, integrating dermatologic and endocrine perspectives on mechanisms and care pathways. Results: VEGFR-targeted TKIs frequently cause hand–foot skin reaction, xerosis, fissuring, paronychia, and impaired wound healing; multikinase inhibition also produces alopecia, pigmentary changes, and mucositis. Epidermal growth factor receptor (EGFR) and rearranged during transfection (RET) inhibition with vandetanib is associated with acneiform eruption, photosensitivity, and nail fragility. Pathogenesis reflects on-target inhibition of VEGF/EGFR signaling leading to keratinocyte dysfunction, vascular fragility, and altered eccrine mechanics. Early risk stratification, patient education, and bundle-based prophylaxis (emollients, keratolytics, urea-based creams, sun protection) reduce incidence and severity. Grade-based algorithms combining topical corticosteroids/antibiotics, dose interruptions or reductions, and short systemic courses (e.g., doxycycline, antihistamines) enable symptom control while maintaining anticancer intensity. Close coordination around procedures minimizes wound-healing complications. Conclusions: Dermatologic toxicities are predictable, mechanism-linked, and manageable with proactive, multidisciplinary care. 
Standardized prevention and treatment pathways tailored to specific TKIs—particularly lenvatinib, vandetanib, and cabozantinib—can preserve dose intensity, optimize quality of life, and sustain antineoplastic efficacy. Full article

12 pages, 847 KB  
Article
Impact of a Three-Strain Lactobacilli Probiotic (BioK+) on Incidence of Hospital-Onset Clostridioides difficile: A Retrospective Observational Cohort Study
by Matthew A. Jenest, Randolph V. Fugit, Jason Wright, Mary T. Bessesen and Shelley E. Kon
Antibiotics 2025, 14(12), 1225; https://doi.org/10.3390/antibiotics14121225 - 4 Dec 2025
Cited by 1 | Viewed by 1457
Abstract
Background: Prevention of hospital-onset Clostridioides difficile infection (HO-CDI) is a priority for hospitals. In addition to standard infection control measures, some probiotics show promise in reducing HO-CDI incidence; however, prior research has produced mixed results. Methods: Retrospective, observational cohort study of HO-CDI incidence among inpatients treated with or without BioK+ probiotic prophylaxis. BioK+, a probiotic containing three Lactobacilli strains, was administered to patients on antibiotics at high risk for HO-CDI and continued until 5 days after antibiotics were discontinued or until the patient was discharged. The primary outcome was HO-CDI incidence. Results: Of 494 eligible patients on high-risk antibiotics, 343 received BioK+ probiotics. No cases of HO-CDI were identified in patients who received BioK+, compared to three cases among patients not on BioK+ (p = 0.028). In the baseline period (1 April 2021–31 March 2022), the HO-CDI incidence density was 5.62 cases per 10,000 patient-days. In the BioK+ probiotic period (1 April 2022–31 March 2023), the incidence density was 2.22 cases per 10,000 patient-days (p = 0.03). Conclusions: When bundled with standard infection control practices, the use of BioK+ probiotics was associated with a statistically significant decrease in HO-CDI incidence among patients prescribed high-risk antibiotics. Full article
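The incidence-density figures reported above follow from a simple rate calculation (cases ÷ patient-days × 10,000). A minimal sketch, using hypothetical denominators since the abstract reports only the rates, not the underlying patient-day totals:

```python
def incidence_density(cases: int, patient_days: float, per: float = 10_000) -> float:
    """Return cases per `per` patient-days, rounded to two decimals."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return round(cases / patient_days * per, 2)

# Hypothetical example: 3 HO-CDI cases observed over 13,500 patient-days
# (denominator invented for illustration, not taken from the study).
rate = incidence_density(3, 13_500)
print(rate)  # -> 2.22 per 10,000 patient-days
```

Comparing periods this way normalizes case counts to exposure time, which is why the abstract reports rates rather than raw case totals.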

11 pages, 590 KB  
Article
A Quality Improvement Bundle to Reduce Central Line-Associated Bloodstream Infections in Neonatal Intensive Care Unit: An Observational Study
by Chiara Poggi, Giulia Fontanelli, Martina Ciarcià, Giovanni Sassudelli, Camilla Fazi, Leonardo Fioravanti, Silvia Grassellini, Monica Piazza and Carlo Dani
Antibiotics 2025, 14(12), 1208; https://doi.org/10.3390/antibiotics14121208 - 1 Dec 2025
Viewed by 1701
Abstract
Background: Dedicated bundles have been shown to reduce central line-associated bloodstream infections (CLABSIs) in the neonatal intensive care unit (NICU). Methods: We performed an observational pre–post study to evaluate the impact of a bundle for CLABSI prevention in our NICU. All umbilical vein catheters (UVCs) and epicutaneo-caval catheters (ECCs) with dwell time > 2 days were included. The primary outcome was the CLABSI rate per 1000 central line (CL) days. Results: A total of 145 catheters (67 UVCs and 78 ECCs) and 142 catheters (65 UVCs and 77 ECCs) were inserted before and after bundle implementation, respectively. UVC duration was significantly shorter before than after the bundle [4 (3–6) vs. 8 (6–11) days; p < 0.0001], while ECC duration did not differ [10 (6–17) vs. 11 (6–19) days; p = 0.711]. CLABSIs were less frequent after bundle implementation than before (3.6 vs. 10.7/1000 CL days; p = 0.042); both UVC-related and ECC-related CLABSIs were significantly reduced (0 vs. 7.2/1000 CL days, p = 0.015; and 4.4 vs. 12.3/1000 CL days, p = 0.044, respectively). The Kaplan–Meier curve for ECC-related CLABSIs showed no difference between the two periods (p = 0.255), but higher CLABSI-free survival after the bundle was found when considering only ECCs with dwell time < 14 days (p = 0.040). Gestational age (p = 0.004) and the bundle (p = 0.026) were predictive factors for CLABSIs. Non-infective complications were significantly less frequent after bundle implementation than before (11% vs. 20%, p = 0.033). Conclusions: Our bundle reduced the overall CLABSI rate as well as both UVC- and ECC-related CLABSI occurrence. Its efficacy for reducing ECC-related CLABSIs seems limited to the first 14 days of dwell time. Full article
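The Kaplan–Meier analysis referenced above estimates CLABSI-free survival over catheter dwell time. A minimal pure-Python sketch of the estimator, using made-up catheter data rather than the study's (event = 1 marks a CLABSI; event = 0 marks censoring, e.g., removal of an uninfected catheter):

```python
def kaplan_meier(observations):
    """Kaplan-Meier estimator.

    observations: list of (time, event) pairs, event = 1 for a CLABSI,
    event = 0 for censoring. Returns (time, survival) pairs at each
    time point where at least one event occurred.
    """
    data = sorted(observations)
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t in sorted({time for time, _ in data}):
        # events and total subjects leaving the risk set at time t
        events = sum(1 for time, e in data if time == t and e == 1)
        removed = sum(1 for time, _ in data if time == t)
        if events:
            survival *= 1 - events / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
    return curve

# Made-up dwell times (days) for five catheters:
obs = [(3, 1), (5, 0), (7, 1), (9, 0), (12, 0)]
print(kaplan_meier(obs))
```

The step-down at each event time multiplies the running survival by (1 − events / number at risk), which is how restricting the analysis to dwell times < 14 days can reveal a difference that the full curve obscures.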
