Search Results (21)

Search Parameters:
Keywords = unmatched receiving

17 pages, 1523 KB  
Article
Dynamic Assessment of Modified EASIX (m-EASIX) at 48 Hours Predicts Adverse Outcomes in Acute Pancreatitis: A Propensity Score-Matched Study
by Hikmet Öztop, Enes Yavuz, Nevriye Gül Ada Tak, Fatih Eren and Fazıl Çağrı Hunutlu
Medicina 2026, 62(3), 568; https://doi.org/10.3390/medicina62030568 - 18 Mar 2026
Viewed by 275
Abstract
Background and Objectives: Early risk stratification in acute pancreatitis (AP) remains challenging, particularly for identifying patients who initially appear low-risk but later develop complications. The Modified Endothelial Activation and Stress Index (m-EASIX) reflects endothelial injury and systemic inflammation. This study evaluated the prognostic value of dynamic 48 h m-EASIX assessment for predicting adverse clinical outcomes in AP. Materials and Methods: This retrospective cohort study included adult patients hospitalized with AP between January 2020 and June 2025. Propensity score matching (1:1) was performed using age, sex, BISAP score and etiology. Laboratory parameters were recorded at admission and at 48 h. Adverse outcomes were defined as prolonged hospitalization (≥8 days) and/or pancreatic necrosis, abscess, intensive care unit admission or in-hospital mortality. Multivariable logistic regression was used to identify independent predictors of adverse outcomes. Receiver operating characteristic (ROC) analysis evaluated the predictive performance of m-EASIX and compared it with BISAP and Ranson scores. Results: A total of 258 patients were included in the initial cohort, of whom 93 experienced an adverse clinical course. After propensity score matching, 170 patients remained in the final analysis (85 per group). The 48 h m-EASIX score was independently associated with adverse outcomes in both unmatched and matched cohorts. ROC analysis showed a moderate discrimination for composite outcomes (AUC ≈ 0.76) and a stronger discrimination for hard outcomes (AUC up to 0.867). In all analyses, m-EASIX significantly outperformed BISAP and Ranson scores (DeLong test p < 0.001). Dynamic risk reclassification showed that m-EASIX identified a subgroup of patients initially classified as low-risk by BISAP who later developed adverse outcomes. 
Conclusions: The dynamic assessment of m-EASIX at 48 h provides additional prognostic information for early risk stratification in AP and may help identify patients at an increased risk of unfavorable clinical courses. Full article
(This article belongs to the Special Issue Acute Pancreatitis: From Pathogenesis to Treatment)
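The AUC values reported in this abstract (≈0.76 for composite outcomes, up to 0.867 for hard outcomes) are rank statistics: the probability that a randomly chosen patient with the outcome scores higher than a randomly chosen patient without it. A minimal sketch of that computation, with invented scores rather than data from the study:

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC: the probability that a randomly chosen positive
    case scores higher than a randomly chosen negative one (ties count
    half). Equivalent to the normalised Mann-Whitney U statistic."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented 48 h index values, higher in patients with an adverse course:
adverse = [2.4, 3.1, 1.8, 4.0]
uneventful = [1.2, 0.9, 2.0, 1.5, 1.1]
auc(adverse, uneventful)  # 0.95 for these toy values
```

An AUC of 0.5 corresponds to no discrimination, 1.0 to perfect separation; the DeLong test mentioned in the abstract compares two such AUCs computed on the same patients.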

14 pages, 1691 KB  
Article
Clostridium difficile Colonization in Oncologic Patients Undergoing Major Abdominopelvic Surgery: To Treat or Not to Treat? An Observational Study with Propensity Score Analysis
by Sorinel Lunca, Wee Liam Ong, Raluca Zaharia, Romulus Mihaita Pruna, Gabriel Mihail Dimofte and Stefan Morarasu
Medicina 2025, 61(9), 1606; https://doi.org/10.3390/medicina61091606 - 5 Sep 2025
Viewed by 963
Abstract
Background: Clostridium difficile colonization (CDC) represents a clinical concern in oncology patients undergoing abdominopelvic surgery, particularly regarding the potential role of prophylactic antibiotics in preventing progression to active infection. Methods: We performed a single-center, retrospective, case-matched observational study of oncology patients with CDC who underwent abdominopelvic surgery between 2018 and 2023. Patients were divided into two cohorts: those who received prophylactic antibiotics and those who did not. Postoperative outcomes were compared using propensity score matching (PSM). Logistic regression and ROC curve analysis were applied to assess the predictive value of antibiotics relative to other comorbidities. Results: Ninety patients were included (62 with antibiotics; 28 without). In the unmatched cohort, patients receiving antibiotics showed a non-significant trend toward higher morbidity (32.2% vs. 21.4%, p = 0.327) and surgical site infection rates (9.6% vs. 0%, p = 0.171). After PSM (26 patients per group), morbidity remained comparable between cohorts (30.7% vs. 23.0%, p = 0.538). Notably, no patient developed active C. difficile infection during follow-up, regardless of antibiotic use. Antibiotic therapy was not an independent predictor of postoperative morbidity (OR 1.746, p = 0.297; AUC 0.549, 95% CI 0.405–0.687). Conclusions: In this study, prophylactic antibiotic use in CDC patients undergoing abdominopelvic oncology surgery was not associated with improved postoperative outcomes. While no progression to active infection was observed, the potential benefits of prophylaxis remain uncertain. Larger, prospective studies are needed to clarify the clinical role of antibiotics in this setting. Full article
(This article belongs to the Section Oncology)
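Several studies in these results rely on 1:1 propensity score matching. A minimal sketch of the matching step, assuming propensity scores have already been estimated (e.g. by logistic regression on the baseline covariates); the greedy nearest-neighbour-with-caliper approach shown here is one common variant, and all IDs and scores are invented:

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity score.
    treated, control: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs; treated units
    with no control within the caliper remain unmatched."""
    available = dict(control)
    pairs = []
    # Match the highest-scoring (hardest-to-match) treated units first.
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1], reverse=True):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # each control is used at most once
    return pairs

# Invented example: three antibiotic recipients, four non-recipients.
pairs = greedy_match(
    {"t1": 0.80, "t2": 0.55, "t3": 0.30},
    {"c1": 0.78, "c2": 0.52, "c3": 0.10, "c4": 0.31},
)
```

After matching, outcomes are compared only within the matched pairs, which is why the matched cohorts above shrink to equal group sizes (26 per group here, 85 per group in the pancreatitis study).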

18 pages, 1158 KB  
Article
Ten-Year Trend in the Potentially Inappropriate Prescribing of Renally-Dependent Medicines in Australian General Practice Patients with Dementia
by Saad Alhumaid, Woldesellassie M. Bezabhe, Mackenzie Williams and Gregory M. Peterson
J. Clin. Med. 2025, 14(13), 4734; https://doi.org/10.3390/jcm14134734 - 4 Jul 2025
Viewed by 1232
Abstract
Background: There is limited published evidence on the prevalence of potentially inappropriate prescribing of medicines in relation to kidney function in older Australians, particularly those with dementia. Objectives: To examine the prevalence, temporal trends and factors associated with potentially inappropriate prescribing of renally-dependent medicines in patients with dementia, using Australian general practice data. Methods: This comparative study was reported in accordance with the STROBE guidelines for cohort studies. Retrospective analyses of the National Prescribing Service (NPS) MedicineInsight dataset were performed to determine the proportion of patients aged ≥ 65 years with a recorded diagnosis of dementia, along with matched controls, who had potentially inappropriate prescribing based on their estimated glomerular filtration rate (eGFR) during the study period (2011–2020). Each patient was included only once throughout the study. Potentially inappropriate prescribing was evaluated for 33 commonly used medicines, using the Cockcroft-Gault equation for estimated creatinine clearance or eGFR, in accordance with the guidelines from the Australian Medicines Handbook (AMH). Each patient’s medicines were included if they were prescribed within 180 days after the most recent recorded lowest eGFR value for the patient. Medicines having prescribed doses exceeding those recommended for an individual’s renal function were classified as ‘inappropriate dosage’, while those whose use was advised against were labelled ‘contraindicated’. Both categories were regarded as inappropriate prescriptions. Descriptive statistics were used to summarise patient characteristics and medication use. Temporal trends were displayed in graphs, with statistical significance determined using the Cochran-Armitage test. 
Binary logistic regression models were used to examine the associations between sociodemographic and clinical factors and the prescribing of medicines inconsistent with AMH guidelines. Results: The unmatched cohorts included 33,101 patients, comprising 4092 with dementia and 29,009 without. Among them, 58.4% were female, and the overall median age was 82 years [interquartile range (IQR): 77–87]. After propensity score matching, there were 4041 patients with dementia and 8031 without dementia. Over the study period, potentially inappropriate prescribing increased slightly, but insignificantly, in both groups of patients; the prevalence of inappropriate use of at least one of the 33 drugs of interest rose from 6.5% (95% CI 4.5–9.1%) in 2011 to 8.9% (95% CI 6.0–12.7%; p for trend: 0.966) in 2020 in the dementia group, and 9.2% (95% CI 8.0–10.5%) to 11.1% (95% CI 10.3–12.0%; p for trend: 0.224) in the matched controls. Over the ten-year period, approximately 9.3% (377) of patients with dementia in the matched cohort received at least one potentially inappropriate prescription. Among these, 154 (40.8%) were for contraindicated medicines, and 223 (59.1%) were for inappropriate doses based on renal function. Among patients with dementia in the matched cohort, fenofibrate, nitrofurantoin, and moxonidine were the most frequently prescribed medicines at doses inconsistent with AMH guidelines. In the unmatched dementia cohort, potentially inappropriate prescribing was not significantly associated with demographic characteristics or most comorbidities; however, it occurred more frequently in patients with an eGFR below 30 mL/min/1.73 m2 or those with concomitant diabetes. Conclusions: Positively, the prevalence of potentially inappropriate prescribing of renally-dependent medicines in primary care patients with dementia in Australia was similar to their matched controls. 
However, there was room for improvement in the prescribing of these drugs in both patients with and without dementia. Full article
(This article belongs to the Special Issue Clinical Epidemiology in Chronic Kidney Disease)
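The Cockcroft-Gault equation cited in the Methods estimates creatinine clearance as (140 − age) × weight / (72 × serum creatinine), with a 0.85 multiplier for women. A sketch, assuming serum creatinine in mg/dL and weight in kg; the example values are invented, not drawn from the study cohort:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) by Cockcroft-Gault.
    Assumes serum creatinine in mg/dL and actual body weight in kg."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Invented example: an 82-year-old woman, 60 kg, creatinine 1.5 mg/dL:
cockcroft_gault(82, 60, 1.5, female=True)  # ~27.4 mL/min
```

A result in this range falls below the 30 mL/min-type thresholds at which many renally-dependent medicines require dose reduction or are contraindicated, which is the kind of check the study automated against the AMH guidance.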

15 pages, 241 KB  
Article
Gender-Specific Outcomes in TAVI with Self-Expandable Valves: Insights from a Large Real-World Registry
by Alessandro Sticchi, Dario Grassini, Francesco Gallo, Stefano Benenati, Won-Keun Kim, Arif A. Khokhar, Tobias Zeus, Stefan Toggweiler, Roberto Galea, Federico De Marco, Antonio Mangieri, Damiano Regazzoli, Bernhard Reimers, Luis Nombela-Franco, Marco Barbanti, Ander Regueiro, Tommaso Piva, Josep Rodés-Cabau, Italo Porto, Antonio Colombo and Francesco Giannini
J. Clin. Med. 2025, 14(9), 3144; https://doi.org/10.3390/jcm14093144 - 1 May 2025
Cited by 3 | Viewed by 1593
Abstract
Background/Objectives: Aortic stenosis (AS) is the most prevalent valvular heart disease in developed countries and imposes an increasing burden on aging populations. Although transcatheter aortic valve implantation (TAVI) has transformed the treatment of severe AS, current guidelines do not differentiate management based on gender. This study aimed to investigate gender-based differences in procedural complications and one-year clinical outcomes in patients treated with next-generation self-expandable TAVI devices. Methods: This retrospective, multicenter international registry included 3862 consecutive patients who received either the ACURATE neo or Evolut R/Pro valve. Patients were stratified by gender; propensity score matching (PSM) adjusted for baseline differences. The primary endpoint was a composite of all-cause mortality or stroke at one year. Secondary endpoints included major vascular complications, major or life-threatening bleeding and acute kidney injury (AKI). Results: Of 3353 patients included (64.5% female), women were older (82.3 ± 5.6 vs. 81.1 ± 6.2 years, p < 0.001) and had higher STS scores (5.2 ± 3.9 vs. 4.5 ± 3.4%, p < 0.001). In the unmatched population, major vascular complications occurred in 7.7% of females versus 4.1% of males (p < 0.001), life-threatening bleeding in 2.8% vs. 1.4% (p = 0.016) and AKI in 8.5% vs. 5.7% (p = 0.009). After PSM, the primary endpoint was more frequent in females (9.4% vs. 6.0%, p = 0.014), largely driven by stroke (2.8% vs. 1.2%, p = 0.024), while overall mortality was similar (11.3% vs. 9.5%, p = 0.264). Conclusions: Despite comparable long-term survival, female patients undergoing TAVI with self-expandable valves experience higher rates of procedural complications, notably stroke and major vascular events. These findings underscore the need for tailored procedural strategies to improve outcomes in female patients. Full article
12 pages, 502 KB  
Article
Unsuppressed HIV Viral Load and Related Factors in Patients Receiving Antiretroviral Treatment in Tanganyika Province, Democratic Republic of Congo (DRC)
by Michel Luhembwe, Richard Ingwe, Aimée Lulebo, Dalau Nkamba and John Ditekemena
BioMed 2024, 4(3), 338-349; https://doi.org/10.3390/biomed4030027 - 18 Sep 2024
Cited by 1 | Viewed by 3702
Abstract
Antiretroviral treatment (ART) has revolutionized the management of the human immunodeficiency virus (HIV) and acquired immunodeficiency syndrome (AIDS), enabling long-term viral load (VL) suppression in patients. Despite the proven effectiveness of ART, a significant proportion of patients with HIV receiving ART fail to achieve viral load suppression (VLS). This study aimed to identify factors associated with low VLS in the Tanganyika province. An unmatched case–control study was conducted from January 2022 to June 2023, including 22 care facilities with viral load data. Data were collected from patient records. For each reviewed record, the patient was invited for an interview upon providing informed consent. Data were analyzed using SPSS version 27. In a multivariable binary logistic regression model, variables with a p-value < 0.05 and a 95% confidence interval for the adjusted odds ratio were considered significantly associated with unsuppressed VL. A total of 462 individuals, including 156 cases and 306 controls, were included in the study. The mean age (standard deviation) of participants was 42.12 (±11.6) years. The following covariates were significantly associated with unsuppressed VL: poor HIV status disclosure to a confidant [adjusted OR = 2.10, 95% CI (1.33–3.31), p = 0.001], poor ART adherence [adjusted OR = 2.01, 95% CI (1.25–3.23), p = 0.004], ART interruption [adjusted OR = 3.43, 95% CI (2.00–5.88), p < 0.001], no participation in support groups [adjusted OR = 2.16, 95% CI (1.25–3.71), p = 0.005], baseline WHO clinical stage 3 and 4 [adjusted OR = 2.24, 95% CI (1.32–3.79), p = 0.003], opportunistic infections (OIs) [adjusted OR = 2.30, 95% CI (1.27–4.16), p = 0.006], and non-communicable chronic diseases (NCDs) [adjusted OR = 2.30, 95% CI (1.10–4.79), p = 0.026]. Given the clear association between several factors and unsuppressed VL, prevention should involve the implementation of innovative strategies targeting at-risk patient groups. 
Strengthening the monitoring of these factors among active patients at each appointment is recommended to achieve this goal. Full article
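The adjusted odds ratios above come from multivariable logistic regression; for a single exposure in a case-control design, the unadjusted odds ratio reduces to the cross-product of a 2×2 table, with a Woolf (log-based) confidence interval. A sketch with invented counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts for a hypothetical exposure among 156 cases and 306 controls:
or_, lo, hi = odds_ratio_ci(a=60, b=96, c=70, d=236)  # OR ~2.11, CI ~(1.39, 3.20)
```

The multivariable model in the study additionally adjusts each such odds ratio for the other covariates, which is why the reported values are labelled "adjusted OR".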

13 pages, 1655 KB  
Article
Two Decades of CABG in the UK: A Propensity Matched Analysis of Outcomes by Conduit Choice
by Georgia R. Layton, Shubhra Sinha, Massimo Caputo, Gianni D. Angelini, Daniel P. Fudulu and Mustafa Zakkar
J. Clin. Med. 2024, 13(16), 4717; https://doi.org/10.3390/jcm13164717 - 12 Aug 2024
Cited by 4 | Viewed by 1742
Abstract
Background/Objectives: Grafting of LIMA to LAD has long been considered the gold-standard conduit choice for patients undergoing CABG. Despite this, the LSV remains the most used conduit by volume and some patients may not receive even a single arterial conduit. However, the outcomes in this group are not frequently explored. This study, therefore, compares in-hospital outcomes of patients who underwent CABG without any arterial conduits to those who received at least one arterial conduit. Methods: Retrospective propensity-matched database analysis of consecutive patients undergoing CABG in the UK between 1996 and 2019 using data from the National Adult Cardiac Surgery Audit. Results: 335,144 patients underwent CABG, with 6% receiving venous conduits only; matched outcomes are reported for 39,812 patients. In both unmatched and matched groups, we found a significant increase in mortality with the use of veins only (matched mortality 5.3% vs. 3.8%, p < 0.001) with estimated treatment effect for mortality OR 1.43, p < 0.001 (95% CI: 1.31–1.57). We also identified greater rates of post-operative dialysis, IABP insertion, and length of hospital stay in this group. Conclusions: We identified a significant increase in in-hospital mortality with the use of veins only compared to using at least one arterial graft to the LAD. While a single arterial graft should be prioritised wherever possible, venous revascularisation retains a critical role for specific patients. We must, therefore, continue to conduct research addressing the mechanisms underlying and propagating vein graft disease in order better to optimise outcomes for this niche patient group after CABG. Full article
(This article belongs to the Section Cardiology)

10 pages, 889 KB  
Article
Early Repair of Rib Fractures Is Associated with Superior Length of Stay and Total Hospital Cost: A Propensity Matched Analysis of the National Inpatient Sample
by Christopher W. Towe, Katelynn C. Bachman, Vanessa P. Ho, Fredric Pieracci, Stephanie G. Worrell, Matthew L. Moorman, Philip A. Linden and Avanti Badrinathan
Medicina 2024, 60(1), 153; https://doi.org/10.3390/medicina60010153 - 14 Jan 2024
Cited by 5 | Viewed by 3202
Abstract
Background and Objectives: Previous studies have suggested that early scheduling of the surgical stabilization of rib fractures (SSRF) is associated with superior outcomes. It is unclear if these data are reproducible at other institutions. We hypothesized that early SSRF would be associated with decreased morbidity, length of stay, and total charges. Materials and Methods: Adult patients who underwent SSRF for multiple rib fractures or flail chest were identified in the National Inpatient Sample (NIS) by ICD-10 code from the fourth quarter of 2015 to 2016. Patients were excluded for traumatic brain injury and missing study variables. Procedures occurring after hospital day 10 were excluded to remove possible confounding. Early fixation was defined as procedures which occurred on hospital day 0 or 1, and late fixation was defined as procedures which occurred on hospital days 2 through 10. The primary outcome was a composite outcome of death, pneumonia, tracheostomy, or discharge to a short-term hospital, as determined by NIS coding. Secondary outcomes were length of hospitalization (LOS) and total cost. Chi-square and Wilcoxon rank-sum testing were performed to determine differences in outcomes between the groups. One-to-one propensity matching was performed using covariates known to affect the outcome of rib fractures. Stuart–Maxwell marginal homogeneity and Wilcoxon signed rank matched pair testing was performed on the propensity-matched cohort. Results: Of the 474 patients who met the inclusion criteria, 148 (31.2%) received early repair and 326 (68.8%) received late repair. In unmatched analysis, the composite adverse outcome was lower among early fixation (16.2% vs. 40.2%, p < 0.001), total hospital cost was less (USD114k vs. USD215k, p < 0.001), and length of stay was shorter (6 days vs. 12 days) among early SSRF patients. Propensity matching identified 131 matched pairs of early and late SSRF. 
Composite adverse outcomes were less common among early SSRF (18.3% vs. 32.8%, p = 0.011). The LOS was shorter among early SSRF (6 days vs. 10 days, p < 0.001), and total hospital cost was also lower among early SSRF patients (USD118k vs. USD183k late, p = 0.001). Conclusion: In a large administrative database, early SSRF was associated with reduced adverse outcomes, as well as improved hospital length of stay and total cost. These data corroborate other research and suggest that early SSRF is preferred. Studies of outcomes after SSRF should stratify analyses by timing of procedure. Full article
(This article belongs to the Section Surgery)

14 pages, 1276 KB  
Article
Factors Governing the Erythropoietic Response to Intravenous Iron Infusion in Patients with Chronic Kidney Disease: A Retrospective Cohort Study
by Chukwuma A. Chukwu, Helen Gilbody, Olivia Wickens, Craig Carroll, Sunil Bhandari and Philip A. Kalra
Biomedicines 2023, 11(9), 2417; https://doi.org/10.3390/biomedicines11092417 - 29 Aug 2023
Cited by 2 | Viewed by 2932
Abstract
Background: Limited knowledge exists about factors affecting parenteral iron response. A study was conducted to determine the factors influencing the erythropoietic response to parenteral iron in iron-deficient anaemic patients whose kidney function ranged from normal through all stages of chronic kidney disease (CKD) severity. Methods: This retrospective cohort study included parenteral iron recipients who did not receive erythropoiesis-stimulating agents (ESA) between 2017 and 2019. The study cohort was derived from two groups of patients: those managed by the CKD team and patients being optimised for surgery in the pre-operative clinic. Patients were categorized based on their kidney function: Patients with normal kidney function [estimated glomerular filtration rate (eGFR) ≥ 60 mL/min/1.73 m2] were compared to those with CKD stages 3–5 (eGFR < 60 mL/min/1.73 m2). Patients were further stratified by the type of iron deficiency [absolute iron deficiency (AID) versus functional iron deficiency (FID)]. The key outcome was change in hemoglobin (∆Hb) between pre- and post-infusion haemoglobin (Hb) values. Parenteral iron response was assessed using propensity-score matching and multivariate linear regression. The impact of kidney impairment versus the nature of iron deficiency (AID vs. FID) in response was explored. Results: 732 subjects (mean age 66 ± 17 years, 56% females and 87% White) were evaluated. No significant differences were observed in the time to repeat Hb among CKD stages and FID/AID patients. The Hb rise was significantly lower with lower kidney function (non-CKD and CKD1–2; 13 g/L, CKD3–5; 7 g/L; p < 0.001). When groups with different degrees of renal impairment were propensity-score matched according to whether iron deficiency was due to AID or FID, the level of CKD was found not to be relevant to Hb responses [unmatched (∆Hb) 12.1 vs. 8.7 g/L; matched (∆Hb) 12.4 vs. 12.1 g/L in non-CKD and CKD1–2 versus CKD3–5, respectively]. 
However, a comparison of patients with AID and FID, while controlling for the degree of CKD, indicated that patients with FID exhibited a diminished Hb response regardless of their level of kidney impairment. Conclusion: The nature of iron deficiency rather than the severity of CKD has a stronger impact on Hb response to intravenous iron with an attenuated response seen in functional iron deficiency irrespective of the degree of renal impairment. Full article
(This article belongs to the Special Issue Advances in Iron Deficiency and Iron-Related Disorders)

12 pages, 922 KB  
Article
Tocilizumab Outcomes in Critically Ill COVID-19 Patients Admitted to the ICU and the Role of Non-Tocilizumab COVID-19-Specific Medical Therapeutics
by Alyaa Elhazmi, Ahmed A. Rabie, Awad Al-Omari, Hani N. Mufti, Hend Sallam, Mohammed S. Alshahrani, Ahmed Mady, Adnan Alghamdi, Ali Altalaq, Mohamed H. Azzam, Anees Sindi, Ayman Kharaba, Zohair A. Al-Aseri, Ghaleb A. Almekhlafi, Wail Tashkandi, Saud A. Alajmi, Fahad Faqihi, Abdulrahman Alharthy, Jaffar A. Al-Tawfiq, Rami Ghazi Melibari and Yaseen M. Arabi
J. Clin. Med. 2023, 12(6), 2301; https://doi.org/10.3390/jcm12062301 - 16 Mar 2023
Cited by 3 | Viewed by 3214
Abstract
Background: Tocilizumab is a monoclonal antibody proposed to manage cytokine release syndrome (CRS) associated with severe COVID-19. Previously published reports have shown that tocilizumab may improve the clinical outcomes of critically ill patients admitted to the ICU. However, no precise data about the role of other medical therapeutics concurrently used for COVID-19 on this outcome have been published. Objectives: We aimed to compare the overall outcome of critically ill COVID-19 patients admitted to the ICU who received tocilizumab with the outcome of matched patients who did not receive tocilizumab while controlling for other confounders, including medical therapeutics for critically ill patients admitted to ICUs. Methods: A prospective, observational, multicenter cohort study was conducted among critically ill COVID-19 patients admitted to the ICU of 14 hospitals in Saudi Arabia between 1 March 2020, and October 31, 2020. Propensity-score matching was utilized to compare patients who received tocilizumab to patients who did not. In addition, the log-rank test was used to compare the 28 day hospital survival of patients who received tocilizumab with those who did not. Then, a multivariate logistic regression analysis of the matched groups was performed to evaluate the impact of the remaining concurrent medical therapeutics that could not be excluded via matching 28 day hospital survival rates. The primary outcome measure was patients’ overall 28 day hospital survival, and the secondary outcomes were ICU length of stay and ICU survival to hospital discharge. Results: A total of 1470 unmatched patients were included, of whom 426 received tocilizumab. The total number of propensity-matched patients was 1278. 
Overall, 28 day hospital survival revealed a significant difference between the unmatched non-tocilizumab group (586; 56.1%) and the tocilizumab group (269; 63.1%) (p-value = 0.016), and this difference increased even more in the propensity-matched analysis between the non-tocilizumab group (466.7; 54.6%) and the tocilizumab group (269; 63.1%) (p-value = 0.005). The matching model successfully matched the two groups’ common medical therapeutics used to treat COVID-19. Two medical therapeutics remained significantly different, favoring the tocilizumab group. A multivariate logistic regression was performed for the 28 day hospital survival in the propensity-matched patients. It showed that neither steroids (OR: 1.07 (95% CI: 0.75–1.53)) (p = 0.697) nor favipiravir (OR: 1.08 (95% CI: 0.61–1.9)) (p = 0.799) remained as a predictor for an increase in 28 day survival. Conclusion: The tocilizumab treatment in critically ill COVID-19 patients admitted to the ICU improved the overall 28 day hospital survival, which might not be influenced by the concurrent use of other COVID-19 medical therapeutics, although further research is needed to confirm this. Full article
(This article belongs to the Special Issue Pulmonary and Critical Care Practice in the Pandemic of COVID-19)

12 pages, 1685 KB  
Article
Time Trends of Ventricular Reconstruction and Outcomes among Patients with Left Ventricular Thrombus and Aneurysms
by Boqun Shi, Xieraili Tiemuerniyazi, Rui Zhang, Chenxi Song, Kongyong Cui, Dong Zhang, Lei Jia, Dong Yin, Hongjian Wang, Weihua Song, Wei Feng and Kefei Dou
J. Cardiovasc. Dev. Dis. 2022, 9(12), 464; https://doi.org/10.3390/jcdd9120464 - 15 Dec 2022
Cited by 3 | Viewed by 2993
Abstract
Background: Clinical guidelines recommend surgical intervention when left ventricular thrombus (LVT) is complicated with left ventricular aneurysm (LVA). Objectives: This study aimed to review the changes in the treatment of LVT combined with LVA over the past 12 years at our center and to compare the efficacy of medical therapy and surgical treatment on patient outcomes. Methods: Between January 2009 and June 2021, 723 patients with LVT combined with LVA were enrolled, of whom 205 received surgical ventricular reconstruction (SVR) therapy and 518 received medical therapy. The following clinical outcomes were gathered via observation: all-cause death, cardiovascular death, and major adverse cardiovascular and cerebrovascular events (MACCEs; defined as the composite of cardiovascular death, ischemic stroke, and acute myocardial infarction). The median follow-up time was 1403 [707, 2402] days. Results: The proportion of SVR dropped yearly in this group of patients, from a peak of 64.5% in 2010 to 7.5% in 2021 (p for trend < 0.001). Meanwhile, the proportion of anticoagulant use increased quickly, from 8.0% in 2016 to 67.9% in 2021 (p for trend < 0.001). The incidence rates of all-cause mortality, cardiovascular death, and MACCEs were 12.9% (n = 93), 10.5% (n = 76), and 14.7% (n = 106), respectively. In the multivariable analysis, there were no significant differences in all-cause death (HR of 0.60, 95% CI of 0.32–1.13, p = 0.11), cardiovascular death (HR of 0.79, 95% CI of 0.41–1.50, p = 0.5), and MACCEs (HR of 0.82, 95% CI of 0.49–1.38, p = 0.5) between the two groups. The competing risk regression performed in the propensity score matching (PSM) and inverse probability of treatment weighting (IPTW) analyses was in line with the unmatched analysis. Conclusions: The rate of SVR dropped significantly among patients with both LVT and LVA, while there was an improvement in oral anticoagulant utilization. 
SVR with thrombus removal did not improve all-cause mortality and cardiovascular outcomes in patients with LVT and LVA. Ventricular aneurysm with thrombus may not be an indication for surgery. Full article
(This article belongs to the Special Issue Cardiac Surgery: Outcomes, Management and Critical Care)
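The abstract above checks its treatment-effect estimates with inverse probability of treatment weighting (IPTW). A minimal sketch of how stabilized IPTW weights are computed from estimated propensity scores is shown below; this is an illustrative implementation, not the authors' code, and the example scores are made up.

```python
def iptw_weights(treated, propensity, stabilized=True):
    """Inverse probability of treatment weights.

    treated    : list of 0/1 treatment indicators
    propensity : estimated P(treated | covariates) for each subject
    Stabilized weights are scaled by the marginal treatment probability,
    which reduces the variance of the weighted estimator.
    """
    p_treated = sum(treated) / len(treated)  # marginal P(treated)
    weights = []
    for t, ps in zip(treated, propensity):
        w = 1.0 / ps if t else 1.0 / (1.0 - ps)
        if stabilized:
            w *= p_treated if t else (1.0 - p_treated)
        weights.append(w)
    return weights

# Toy cohort: two treated, two control subjects with assumed scores
w = iptw_weights([1, 1, 0, 0], [0.8, 0.5, 0.5, 0.2])
```

In practice the propensity scores would come from a logistic regression of treatment (SVR vs. medical therapy) on baseline covariates, and the weights would feed into a weighted outcome model such as a competing risk regression.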

12 pages, 770 KB  
Article
Practice of Awake Prone Positioning in Critically Ill COVID-19 Patients—Insights from the PRoAcT–COVID Study
by Willemke Stilma, Christel M. A. Valk, David M. P. van Meenen, Luis Morales, Daantje Remmelzwaal, Sheila N. Myatra, Antonio Artigas, Ary Serpa Neto, Frederique Paulus and Marcus J. Schultz
J. Clin. Med. 2022, 11(23), 6988; https://doi.org/10.3390/jcm11236988 - 26 Nov 2022
Cited by 4 | Viewed by 3113
Abstract
We describe the incidence, practice and associations with outcomes of awake prone positioning in patients with acute hypoxemic respiratory failure due to coronavirus disease 2019 (COVID-19) in a national multicenter observational cohort study performed in 16 intensive care units in the Netherlands (PRoAcT–COVID study). Patients were categorized into two groups based on whether they received awake prone positioning. The primary endpoint was the practice of prone positioning. The secondary endpoint was 'treatment failure', a composite of intubation for invasive ventilation and death before day 28. We used propensity matching to control for observed confounding factors. Of 546 patients, 88 (16.1%) received awake prone positioning. Prone positioning started within a median of 1 (0 to 2) day after ICU admission; sessions summed to a median of 12.0 (8.4–14.5) hours over a median of 1.0 day. In the unmatched analysis (HR, 1.80 (1.41–2.31); p < 0.001), but not in the matched analysis (HR, 1.17 (0.87–1.59); p = 0.30), treatment failure occurred more often in patients who received prone positioning. In summary, awake prone positioning was used in one in six COVID-19 patients; it started early, and sessions lasted long but were often discontinued because of the need for intubation. Full article
(This article belongs to the Section Epidemiology & Public Health)

10 pages, 854 KB  
Article
Triglyceride–Glucose Index May Predict Renal Survival in Patients with IgA Nephropathy
by Aiya Qin, Jiaxing Tan, Siqing Wang, Lingqiu Dong, Zheng Jiang, Dandan Yang, Huan Zhou, Xiaoyuan Zhou, Yi Tang and Wei Qin
J. Clin. Med. 2022, 11(17), 5176; https://doi.org/10.3390/jcm11175176 - 1 Sep 2022
Cited by 12 | Viewed by 3357
Abstract
Background: The triglyceride–glucose (TyG) index is a simple, novel and reliable surrogate marker of insulin resistance. However, evidence for the prognostic impact of an elevated TyG index on IgA nephropathy (IgAN) is limited. Therefore, we evaluated the relationship between the TyG index and the risk of renal progression in IgAN. Method: This cohort study involved patients with biopsy-proven IgAN diagnosed between January 2009 and December 2018 at West China Hospital, who were assigned to two groups based on the cut-off value of the TyG index determined by receiver operating characteristic (ROC) curves. A 1:1 matched-pair analysis was established to minimize bias by propensity score matching (PSM). The TyG index was calculated as ln [fasting triglyceride (mg/dL) × fasting glucose (mg/dL)/2]. The composite endpoint was defined as a ≥50% decline in eGFR from the baseline level, end-stage kidney disease (ESKD), renal transplantation and/or death. Univariable and multivariable Cox proportional hazards models were applied to confirm the predictive value of the optimal marker. Results: Before PSM, a total of 1210 participants were ultimately included. During a median follow-up period of 55.8 months (range 37.20–79.09 months), 129 participants (10.7%) progressed to the composite endpoint. After PSM, 366 patients were enrolled in the matched cohort, of whom 34 (9.3%) reached the endpoints. Based on the cut-off value of the TyG index, patients were divided into a low TyG index group (TyG ≤ 8.72, n = 690) and a high TyG index group (TyG > 8.72, n = 520). Further analysis demonstrated that a higher TyG index was significantly associated with a higher risk of reaching the composite endpoints in IgAN patients in both the unmatched and matched cohorts (before PSM: HR 2.509, 95% CI 1.396–4.511, p = 0.002; after PSM: HR 2.654, 95% CI 1.299–5.423, p = 0.007). Conclusion: A high TyG index is associated with a higher risk of renal progression. Full article
(This article belongs to the Special Issue Innate Immunity Nephropathy: Etiology, Diagnosis, and Treatment)
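The TyG index formula stated in the abstract above is simple enough to compute directly. The sketch below applies it together with the study's reported cut-off of 8.72; the patient values in the example are hypothetical.

```python
import math

def tyg_index(fasting_tg_mg_dl, fasting_glucose_mg_dl):
    """TyG = ln[fasting triglyceride (mg/dL) x fasting glucose (mg/dL) / 2]."""
    return math.log(fasting_tg_mg_dl * fasting_glucose_mg_dl / 2.0)

# Hypothetical patient: fasting TG 150 mg/dL, fasting glucose 100 mg/dL
value = tyg_index(150, 100)                # ln(7500) ~= 8.92
group = "high" if value > 8.72 else "low"  # cut-off reported in the study
```

With these example values the patient falls in the high TyG group, the group the study associates with a higher risk of reaching the composite renal endpoint.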

15 pages, 2606 KB  
Article
Plasma Transfusion in Septic Shock—A Secondary Analysis of a Retrospective Single-Center Cohort
by Maximilian Dietrich, Tobias Hölle, Lazar Detelinov Lalev, Martin Loos, Felix Carl Fabian Schmitt, Mascha Onida Fiedler, Thilo Hackert, Daniel Christoph Richter, Markus Alexander Weigand and Dania Fischer
J. Clin. Med. 2022, 11(15), 4367; https://doi.org/10.3390/jcm11154367 - 27 Jul 2022
Cited by 8 | Viewed by 3557
Abstract
In sepsis, both beneficial and detrimental effects of fresh frozen plasma (FFP) transfusion have been reported. The aim of this study was to analyze the indication for and effect of FFP transfusion in patients with septic shock. We performed a secondary analysis of a retrospective single-center cohort of all patients treated for septic shock at the interdisciplinary surgical intensive care unit (ICU) of the Heidelberg University Hospital. Septic shock was defined according to sepsis-3 criteria. To assess the effects of FFP administration in the early phase of septic shock, we compared patients with and without FFP transfusion during the first 48 h of septic shock. Patients who died during the first 48 h of septic shock were excluded from the analysis. Primary endpoints were 30- and 90-day mortality. A total of 261 patients were identified, of whom 100 (38.3%) received FFP transfusion within the first 48 h after septic shock onset. The unmatched analysis showed a trend toward higher 30- and 90-day mortality in the FFP group (30-day: +7%, p = 0.261; 90-day: +11.9%, p = 0.061). In the propensity-matched analysis, 30- and 90-day mortality were similar between groups. Plasma administration did not influence fluid or vasopressor need, lactate levels, ICU stay, or days on a ventilator. We found no significant harm or associated benefit of FFP use in the early phase of septic shock. Pending further evidence, plasma should only be used in patients with a strong indication according to current recommendations, as a conclusive evaluation of the risk–benefit ratio for plasma transfusion in septic shock cannot be made based on the current data. Full article
(This article belongs to the Special Issue Patient Blood Management in Critical Care Medicine)

16 pages, 9574 KB  
Article
Real-World Clinical Outcomes after Genomic Profiling of Circulating Tumor DNA in Patients with Previously Treated Advanced Non-Small Cell Lung Cancer
by Steven Olsen, Jiemin Liao and Hidetoshi Hayashi
Curr. Oncol. 2022, 29(7), 4811-4826; https://doi.org/10.3390/curroncol29070382 - 8 Jul 2022
Cited by 14 | Viewed by 4537
Abstract
Comprehensive genomic profiling for advanced non-small cell lung cancer (NSCLC) can identify patients for molecularly targeted therapies that improve clinical outcomes. We analyzed data from 3084 patients (median age 65 years, 72.9% with adenocarcinoma) with advanced NSCLC registered in a real-world healthcare claims database (GuardantINFORMTM, Guardant Health) who underwent next-generation sequencing (NGS)-based circulating tumor DNA (ctDNA) testing (Guardant360®, Guardant Health) after first-line therapy (28.0% with agents targeted against genomic alterations). ctDNA was detected in 2771 samples (89.9%), of which 41.9% harbored actionable alterations, most commonly EGFR (epidermal growth factor receptor) mutations (29.7%). Actionable alterations were detected in 26.7% of patients (534/2001) previously treated with non-targeted agents. Emerging potentially targetable mutations were found in 40.1% (309/770) of patients previously treated with targeted therapies. Among patients with qualifying alterations detected by ctDNA testing, the time to treatment discontinuation (median 8.8 vs. 4.2 months; hazard ratio 1.97, p < 0.001) and overall survival (median 36.1 vs. 16.6 months; hazard ratio 2.08, p < 0.001) were longer for those who received matched second-line therapy versus unmatched second-line therapy. In real-world practice, results of a blood-based NGS assay prior to second-line treatment inform therapeutic decisions that can improve clinical outcomes for patients with advanced NSCLC. Full article
(This article belongs to the Section Thoracic Oncology)

13 pages, 938 KB  
Article
Stereotactic Body Radiation Therapy versus Concurrent Chemoradiotherapy for Locally Advanced Pancreatic Cancer: A Propensity Score-Matched Analysis
by Young Seob Shin, Hee Hyun Park, Jin-hong Park, Dong-Wan Seo, Sang Soo Lee, Changhoon Yoo, Seonok Kim, Sang Min Yoon, Jinhong Jung, Myung-Hwan Kim, Sung Koo Lee, Do Hyun Park, Tae Jun Song, Dongwook Oh, Baek-Yeol Ryoo, Heung-Moon Chang, Kyu-pyo Kim, Jae Ho Jeong and Jong Hoon Kim
Cancers 2022, 14(5), 1166; https://doi.org/10.3390/cancers14051166 - 24 Feb 2022
Cited by 7 | Viewed by 2979
Abstract
In locally advanced pancreatic cancer (LAPC), stereotactic body radiation therapy (SBRT) has been applied as an alternative to concurrent chemoradiotherapy (CCRT); however, direct comparative evidence between these two modalities is scarce. The aim of this study was to compare the clinical outcomes of SBRT with those of CCRT for LAPC. We retrospectively reviewed the medical records of patients with LAPC who received SBRT (n = 95) or CCRT (n = 66) with a concurrent 5-FU-based regimen between January 2008 and July 2016. The clinical outcomes of freedom from local progression (FFLP), progression-free survival (PFS), overall survival (OS), and toxicities were analyzed before and after propensity score (PS) matching. After a median follow-up duration of 15.5 months (range, 2.3–64.5 months), the median OS, PFS, and FFLP of the unmatched patients were 17.3 months, 11 months, and 19.6 months, respectively. After PS matching, there were no significant differences between the SBRT and CCRT groups in terms of the 1-year rates of OS (66.7% vs. 80%, p = 0.455), PFS (40.0% vs. 54.2%, p = 0.123), and FFLP (77.2% vs. 87.1%, p = 0.691). Our results suggest that SBRT could be a feasible alternative to CCRT in treating patients with LAPC. Full article
(This article belongs to the Special Issue Frontiers in Pancreatic Cancer)
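Propensity score matching of the kind used in the study above (and in several other results on this page) is often implemented as greedy 1:1 nearest-neighbour matching on the estimated score. The following is a minimal sketch of that idea under an assumed caliper; it is not the matching routine the authors used, and the scores in the example are invented.

```python
def greedy_match(treated_ps, control_ps, caliper=0.2):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    Each treated subject is paired with the closest still-unmatched
    control; candidate pairs further apart than `caliper` are discarded.
    Returns a list of (treated_index, control_index) pairs.
    """
    available = dict(enumerate(control_ps))  # control index -> score
    pairs = []
    for ti, tp in enumerate(treated_ps):
        if not available:
            break
        ci = min(available, key=lambda c: abs(available[c] - tp))
        if abs(available[ci] - tp) <= caliper:
            pairs.append((ti, ci))
            del available[ci]  # each control is used at most once
    return pairs

# Toy scores: two treated subjects, three controls
pairs = greedy_match([0.30, 0.70], [0.31, 0.69, 0.50])
```

Outcomes (e.g. 1-year OS, PFS, FFLP) would then be compared only within the matched pairs, which is why the matched sample is smaller than the original cohort.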
