Search Results (3,546)

Search Parameters:
Keywords = probability density

22 pages, 8657 KB  
Article
Hazard Assessment of Shallow Loess Landslides Under Different Rainfall Intensities Based on the SINMAP Model: A Case Study of Yuzhong County
by Peng Wang, Hongwei Teng, Mingyuan Wang, Yahong Deng, Fan Liu and Huandong Mu
Appl. Sci. 2025, 15(21), 11556; https://doi.org/10.3390/app152111556 - 29 Oct 2025
Abstract
The Loess Plateau is one of the most landslide-prone regions in China, where rainfall-induced shallow loess landslides severely constrain regional economic and social development. Therefore, investigating the stability of shallow loess slopes under rainfall conditions is of great significance. Taking Yuzhong County in Gansu Province as an example, this study uses the SINMAP model (Version 2.0) to assess slope stability. The areas of unstable zones under different rainfall intensities were identified, and the spatial distribution of hazard sites was analyzed to evaluate the applicability of this deterministic physical model in the study area. Furthermore, a Personnel Risk Level (PRL) determined by combining population density with the Stability Index (SI, defined as the probability that the factor of safety exceeds 1: SI = Prob (FS > 1)) was proposed and applied to assess the potential impact of landslides on local residents. The novelty of this study lies in three aspects: (1) targeting Yuzhong County (a loess region with scarce comprehensive landslide risk assessments) to fill the regional research gap, (2) quantifying PRL through a modified hazard index (HI = population density × (1/SI)) to achieve spatialized risk mapping for vulnerable populations, and (3) systematically analyzing the dynamic response of slope stability to five gradient rainfall intensities (from light rain to severe rainstorm) and verifying model sensitivity to key parameters. The results show that as rainfall intensity increases, stable areas gradually decrease while unstable areas expand, with stable zones progressively transforming into unstable ones. Greater rainfall intensity also leads to an increase in the number of landslides within unstable zones. The proposed PRL helps delineate the severity of hazards in different townships, providing new references for mitigating casualties and property losses caused by landslides. Full article
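The personnel-risk construction in this abstract is explicit enough to illustrate numerically. Below is a minimal sketch of the stated hazard index, HI = population density × (1/SI), using illustrative SI values and made-up population densities; the PRL class breaks are hypothetical, not the paper's thresholds.

```python
import numpy as np

# Illustrative Stability Index values, SI = Prob(FS > 1), for a few map cells
si = np.array([0.9, 0.6, 0.3, 0.1])          # dimensionless probabilities
pop_density = np.array([50, 200, 200, 800])  # persons per km^2 (made up)

# Hazard index as stated in the abstract: HI = population density * (1 / SI)
hi = pop_density * (1.0 / si)

# Hypothetical Personnel Risk Level bins (the paper's cut points are not given here)
prl = np.digitize(hi, bins=[200, 1000, 5000])  # 0 = low ... 3 = very high

for s, p, h, level in zip(si, pop_density, hi, prl):
    print(f"SI={s:.2f}  pop={p:4d}/km^2  HI={h:8.1f}  PRL class {level}")
```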

33 pages, 7484 KB  
Article
Effect of E-Beam and X-Ray Irradiation on Radiation–Chemical Yield and Reaction Rate of Volatile Organic Compound Transformations
by Victoria Ipatova, Ulyana Bliznyuk, Polina Borshchegovskaya, Timofey Bolotnik, Alexander Chernyaev, Igor Gloriozov, Elena Kozlova, Alexander Nikitchenko, Anastasia Oprunenko, Mariya Toropygina, Irina Ananieva and Igor Rodin
Molecules 2025, 30(21), 4226; https://doi.org/10.3390/molecules30214226 - 29 Oct 2025
Abstract
This study investigates the impact of 1 MeV electron beam and 80 keV X-ray irradiation on the decomposition rate and radiation–chemical yield of 1-hexanol in aqueous saline solution to develop a comprehensive approach to determining reliable volatile organic compound markers for food irradiation. A 50 mg/L 1-hexanol solution was irradiated with the doses ranging from 100 to 8000 Gy at various dose rates ranging from 0.2 to 10 Gy/s to assess the impact of irradiation parameters on the decomposition rate and radiation–chemical yield of volatile compounds typically found in food. GC–MS analysis revealed a non-linear decrease in 1-hexanol concentration with increasing dose, accompanied by the formation of aldehydes, ketones, and secondary alcohols. Among these products, hexanal was detected at the lowest applied dose and exhibited dose-dependent behavior that correlated strongly with 1-hexanol degradation. Density functional theory calculations identified the most probable pathways for the formation of hexanol decomposition products, involving direct ionization, radical reactions, and oxidation. A mathematical model proposed in the study describes dose-dependent transformations of 1-hexanol into hexanal, enabling quantitative estimation of the degradation extent of hexanol. The findings suggest that hexanal can serve as a quantitative marker for hexanol degradation, supporting the development of rapid “dose range” determination methods for food irradiation that ensure microbial safety while minimizing undesirable oxidation of proteins, fats, and carbohydrates. Full article
(This article belongs to the Special Issue Analysis of Natural Volatile Organic Compounds (NVOCs))
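For context on the quantity reported here, the radiation–chemical yield G is conventionally the amount of substance transformed per unit of absorbed energy. The sketch below computes a G-value for 1-hexanol from a concentration drop over a given dose using that standard definition; the post-irradiation concentration and dose are illustrative numbers, not the paper's measurements.

```python
# Radiation-chemical yield G = (moles transformed per kg of solution) / (absorbed dose in Gy).
# 1 Gy = 1 J/kg, so G comes out in mol/J; multiply by 1e6 for the commonly used umol/J.
M_HEXANOL = 102.17          # g/mol, molar mass of 1-hexanol

c0_mg_per_L = 50.0          # initial concentration (as in the abstract)
c_mg_per_L = 30.0           # concentration after irradiation (illustrative value)
dose_Gy = 2000.0            # absorbed dose (illustrative value)

# For a dilute aqueous solution, 1 L ~ 1 kg, so mg/L ~ mg/kg.
delta_mol_per_kg = (c0_mg_per_L - c_mg_per_L) / 1000.0 / M_HEXANOL

g_value_umol_per_J = delta_mol_per_kg / dose_Gy * 1e6
print(f"G(-hexanol) ~ {g_value_umol_per_J:.3f} umol/J")
```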

20 pages, 543 KB  
Article
Pulse Consumption and Metabolic Syndrome: Findings from the Hispanic Community Health Study/Study of Latinos
by Juliana Teruel Camargo, Gabriela Recinos, Amanda S. Hinerman, Chelsea Duong, Erik J. Rodriquez, Jordan J. Juarez, Amanda C. McClain, Sarah K. Alver, Martha L. Daviglus, Linda Van Horn and Eliseo J. Pérez-Stable
Nutrients 2025, 17(21), 3392; https://doi.org/10.3390/nu17213392 - 29 Oct 2025
Abstract
Background/Objectives: Metabolic syndrome affects half of middle-aged (ages 45–64) Hispanic or Latino (Latino) adults. Pulses, fiber-rich plant proteins common in Latino diets (e.g., dry beans and lentils), may mitigate metabolic syndrome. We evaluated the association between pulse intake and metabolic syndrome. Methods: We analyzed data from 6,958 adults aged ≥ 50 in the Hispanic Community Health Study/Study of Latinos (2008–2011) Visit 1. Pulse intake was assessed using two 24 h dietary recalls and categorized into no, low (<1/2 cup), moderate (≥1/2 to 3/4 cup), and high pulse (>3/4 cup) daily intake groups. Metabolic syndrome was defined by criteria including blood pressure ≥130/85 mmHg or medication use, triglycerides ≥150 mg/dL or medication use, high-density lipoprotein cholesterol (men <40 mg/dL and women <50 mg/dL), and waist circumference (men ≥102 cm and women ≥88 cm). We used multivariate logistic regression models with predicted probability proportions to assess the association adjusted for sociodemographic factors, acculturation, diet quality, energy intake, and physical activity. Results: Of the 6,958 participants, 53.1% had metabolic syndrome and 53.4% had a moderate or high pulse intake. Pulse intake varied, where 19.4% had a high intake, 33.9% had a moderate intake, 12.5% had a low intake, and 34.2% had no intake. Moderate (predicted marginal = 0.52, 95% confidence interval [CI] = 0.49, 0.55) and high (predicted marginal = 0.49, 95%CI = 0.45, 0.53) intakes were associated with a lower prevalence of metabolic syndrome. Conclusions: Among Latino adults ≥50 years old, a moderate or high pulse intake was associated with a lower prevalence of metabolic syndrome. Increasing the pulse intake in the population may be linked to reduced metabolic syndrome. Full article
(This article belongs to the Section Nutrition and Public Health)
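The "predicted probability proportions" in this abstract correspond to what are usually called predictive margins: fit a logistic model, then average each person's predicted probability with the exposure set to one level at a time. A minimal sketch with synthetic data follows; the variable names, covariates, and outcome are placeholders, not the HCHS/SOL variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "intake": rng.choice(["none", "low", "moderate", "high"], size=n),
    "age": rng.normal(60, 6, size=n),
    "mets": rng.integers(0, 2, size=n),   # synthetic outcome, for illustration only
})

# Logistic regression of metabolic syndrome on pulse-intake category plus a covariate
model = smf.logit("mets ~ C(intake, Treatment('none')) + age", data=df).fit(disp=0)

# Predictive margins: set everyone to each intake level and average the predictions
for level in ["none", "low", "moderate", "high"]:
    counterfactual = df.assign(intake=level)
    print(level, round(model.predict(counterfactual).mean(), 3))
```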

24 pages, 5039 KB  
Article
Diet Reconstruction Under Limited Prior Information: Dietary Contributions and Isotopic Niche of Metridium senile in the North Yellow Sea
by Yongsong Zhao, Xiujuan Shan, Guangliang Teng, Shiqi Song, Yunlong Chen and Xianshi Jin
Biology 2025, 14(11), 1508; https://doi.org/10.3390/biology14111508 - 28 Oct 2025
Abstract
Biomass of the plumose anemone Metridium senile has surged in the benthic ecosystem of the North Yellow Sea in recent years. Understanding its diet and the proportional contributions of food sources is essential for assessing the ecological consequences of this expansion. The species is often characterized as a passive suspension feeder, yet laboratory feeding trials have documented shrimp consumption. Because prior dietary information from the region is scarce, conventional stable isotope approaches are poorly constrained. We developed an integrative framework coupling trophic position estimation, isotopic niche metrics, spatial point pattern analysis, and a Bayesian mixing model to improve diet attribution under limited prior information and to test whether M. senile preys on small-bodied and juvenile teleosts and invertebrates under natural conditions. Our analyses showed that: (i) M. senile occupied a high trophic position (TP = 3.09 ± 0.25), exceeding those estimated for putative predators in our dataset, implying weak top-down control; (ii) in isotopic niche analyses, M. senile showed high posterior probabilities of occurring within the niches of cephalopods and medium-sized fishes (78.30% and 63.04%, respectively), consistent with shared prey and inconsistent with a strictly suspension-feeding strategy; (iii) mixing space diagnostics informed by spatial point pattern analysis indicated that including small-sized fishes and shrimps as sources was necessary to reconcile the elevated TP; and (iv) the Bayesian mixing model estimated that small-bodied and juvenile teleosts and invertebrates supplied most long-term nutrition (posterior mean ≈ 0.65), with the remainder from suspension-derived sources, consistent with an opportunistic generalist rather than a strict suspension feeder. Sustained predation on small-bodied and juvenile teleosts and invertebrates could suppress early fish recruitment, impose top-down control on forage species, and alter the local food web structure. Management should monitor M. senile (size structure, population density, and co-occurrence with juveniles and forage biota) and consider targeted removals and seafloor litter cleanups in priority habitats. The framework is applicable to diet studies with limited prior information; adding δ34S, compound-specific amino-acid isotopes (CSIA-AA), and DNA-based dietary evidence should further sharpen source discrimination. Full article
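The trophic-position figure quoted here (TP = 3.09 ± 0.25) comes from the authors' Bayesian workflow, but the underlying idea can be shown with the textbook single-baseline estimator, TP = λ + (δ15N_consumer − δ15N_baseline)/Δδ15N. The sketch below uses that simpler formula with illustrative isotope values; it is not the paper's data or its mixing-model machinery.

```python
# Classic single-baseline trophic position estimate:
#   TP = lambda_base + (d15N_consumer - d15N_baseline) / delta_15N
# where lambda_base is the trophic level of the baseline organism and
# delta_15N is the per-step enrichment (commonly taken as ~3.4 permil).

def trophic_position(d15n_consumer, d15n_baseline, lambda_base=2.0, delta_15n=3.4):
    return lambda_base + (d15n_consumer - d15n_baseline) / delta_15n

# Illustrative values (permil); not measurements from the study
print(round(trophic_position(d15n_consumer=11.5, d15n_baseline=7.8), 2))  # ~3.09
```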

29 pages, 589 KB  
Article
Numerical Modeling of a Gas–Particle Flow Induced by the Interaction of a Shock Wave with a Cloud of Particles
by Konstantin Volkov
Mathematics 2025, 13(21), 3427; https://doi.org/10.3390/math13213427 - 28 Oct 2025
Abstract
A continuum model for describing pseudo-turbulent flows of a dispersed phase is developed using a statistical approach based on the kinetic equation for the probability density of particle velocity and temperature. The introduction of the probability density function enables a statistical description of the particle ensemble through equations for the first and second moments, replacing the dynamic description of individual particles derived from Langevin-type equations of motion and heat transfer. The lack of detailed dynamic information on individual particle behavior is compensated by a richer statistical characterization of the motion and heat transfer within the particle continuum. A numerical simulation of the unsteady flow of a gas–particle suspension generated by the interaction of a shock wave with a particle cloud is performed using an interpenetrating continua model and equations for the first and second moments of both gas and particles. Numerical methods for solving the two-phase gas dynamics equations—formulated using a two-velocity and two-temperature model—are discussed. Each phase is governed by conservation equations for mass, momentum, and energy, written in a conservative hyperbolic form. These equations are solved using a high-order Godunov-type numerical method, with time discretization performed by a third-order Runge–Kutta scheme. The study analyzes the influence of two-dimensional effects on the formation of shock-wave flow structures and explores the spatial and temporal evolution of particle concentration and other flow parameters. The results enable an estimation of shock wave attenuation by a granular backfill. The extended pressure relaxation region is observed behind the cloud of particles. Full article
(This article belongs to the Special Issue Numerical Methods and Analysis for Partial Differential Equations)
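The time discretization named here (a third-order Runge–Kutta scheme applied to a conservative hyperbolic system) is often the Shu–Osher strong-stability-preserving variant. The sketch below shows one such step for a generic semi-discrete right-hand side L(u), with a toy upwind advection operator standing in for the paper's Godunov-type flux, which is not reproduced.

```python
import numpy as np

def ssp_rk3_step(u, L, dt):
    """One third-order SSP Runge-Kutta (Shu-Osher) step for du/dt = L(u)."""
    u1 = u + dt * L(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * L(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * L(u2))

# Toy usage: linear advection with a periodic first-order upwind difference as L(u)
nx = 200
dx, a = 1.0 / nx, 1.0
x = np.arange(nx) * dx
u = np.exp(-200 * (x - 0.5) ** 2)                 # initial Gaussian pulse
L = lambda v: -a * (v - np.roll(v, 1)) / dx       # upwind flux difference, periodic
for _ in range(100):
    u = ssp_rk3_step(u, L, dt=0.4 * dx)
print("mass is conserved:", round(float(u.sum() * dx), 4))
```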

18 pages, 3698 KB  
Article
A Temporal Validation Study of Diagnostic Prediction Models for the Screening of Elevated Low-Density and Non-High-Density Lipoprotein Cholesterol
by Wuttipat Kiratipaisarl, Vithawat Surawattanasakul, Wachiranun Sirikul and Phichayut Phinyo
J. Clin. Med. 2025, 14(21), 7617; https://doi.org/10.3390/jcm14217617 - 27 Oct 2025
Abstract
Background/Objectives: Limited accessibility to hypercholesterolemia diagnosis hinders the primary prevention of cardiovascular disease. Therefore, we conducted a prospective, temporal validation study of two diagnostic prediction models, targeting endpoints of elevated low-density lipoprotein cholesterol (LDL-C, ≥160 mg/dL) and non-high-density lipoprotein cholesterol (non-HDL-C, ≥160 mg/dL). Methods: We prospectively recruited workers aged 20–40 years from a single-center, university hospital from March to June 2024 (n = 1099). We determined two diagnostic endpoints: elevated LDL-C and non-HDL-C. The predicted probabilities were derived from the binary logistic regression based on gender, metabolic age, and diastolic blood pressure. We assessed three prediction performances: discrimination from area under the receiver-operating characteristic curve (AuROC); calibration slope (C-slope) and calibration-in-the-large (CITL) from the calibration plot; clinical net benefit from decision curve analysis. Recalibration was based on C-slope and CITL, with a socioeconomic subgroup fairness assessment of AuROC, C-slope, and CITL. Results: From 1099 eligible participants, we identified 135 (12.3%) elevated LDL-C and 251 (22.8%) elevated non-HDL-C cases. The LDL-C model had poor discrimination (AuROC 0.59; 95%-CI, 0.56–0.62), miscalibration (C-slope 0.64; 95%-CI, 0.39–0.88 and CITL −0.14; 95%-CI, −0.27–−0.02), and negligible investigation reduction. The non-HDL-C model had fair discrimination (AuROC 0.67; 95%-CI, 0.64–0.69), miscalibration (C-slope 0.71; 95%-CI, 0.59–0.83 and CITL −0.07; 95%-CI, −0.17–0.03), and 20% investigation reduction at prevalence threshold probability. Updated model fairness improved compared to the original models. Conclusions: Temporal validation demonstrated modest replicability for the elevated non-HDL-C model, with a potential limitation in participants with normal BMI but low muscle and high fat mass. Health practitioners may use updated elevated non-HDL-C models as a non-invasive triage strategy in young adults, with threshold probabilities within the positive clinical net benefit ranges. Further external validation studies in a larger and more diverse population are necessary. Full article
(This article belongs to the Special Issue Clinical Updates on Dyslipidemia)
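The three performance measures named in this abstract (AuROC, calibration slope, calibration-in-the-large) can be computed from predicted probabilities and observed outcomes as in the sketch below. It uses synthetic data and the standard recalibration-model definitions, not the study's own code.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
p_hat = np.clip(rng.beta(2, 8, size=1000), 1e-6, 1 - 1e-6)  # synthetic predicted risks
y = rng.binomial(1, p_hat)                                   # synthetic outcomes

lp = np.log(p_hat / (1 - p_hat))                             # linear predictor (logit)

auroc = roc_auc_score(y, p_hat)

# Calibration slope: coefficient of lp in a logistic recalibration model y ~ lp
slope_fit = sm.GLM(y, sm.add_constant(lp), family=sm.families.Binomial()).fit()
c_slope = slope_fit.params[1]

# Calibration-in-the-large: intercept of a logistic model with lp as a fixed offset
citl_fit = sm.GLM(y, np.ones((y.size, 1)), family=sm.families.Binomial(), offset=lp).fit()
citl = citl_fit.params[0]

print(f"AuROC={auroc:.2f}  C-slope={c_slope:.2f}  CITL={citl:.2f}")
```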

19 pages, 13081 KB  
Article
A Spatiotemporal Wildfire Risk Prediction Framework Integrating Density-Based Clustering and GTWR-RFR
by Shaofeng Xie, Huashun Xiao, Gui Zhang and Haizhou Xu
Forests 2025, 16(11), 1632; https://doi.org/10.3390/f16111632 - 26 Oct 2025
Abstract
Accurate wildfire prediction and identification of key environmental drivers are critical for effective wildfire management. We propose a spatiotemporally adaptive framework integrating ST-DBSCAN clustering with GTWR-RFR. In this hybrid model, Random Forest captures local nonlinear relationships, while GTWR assigns adaptive spatiotemporal weights to refine predictions. Using historical wildfire records from Hunan Province, China, we first derived wildfire occurrence probabilities via ST-DBSCAN, avoiding the need for artificial non-fire samples. We then benchmarked GTWR-RFR against seven models, finding that our approach achieved the highest accuracy (R2 = 0.969; RMSE = 0.1743). The framework effectively captures spatiotemporal heterogeneity and quantifies dynamic impacts of environmental drivers. Key contributing drivers include DEM, GDP, population density, and distance to roads and water bodies. Risk maps reveal that central and southern Hunan are at high risk during winter and early spring. Our approach enhances both predictive performance and interpretability, offering a replicable methodology for data-driven wildfire risk assessment. Full article
(This article belongs to the Special Issue Ecological Monitoring and Forest Fire Prevention)
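The random-forest component of the hybrid model described here can be illustrated in isolation. The sketch below fits a RandomForestRegressor to placeholder driver variables and reports R² and RMSE; it omits the ST-DBSCAN-derived occurrence probabilities and the geographically and temporally weighted (GTWR) residual weighting that the paper layers on top, so the data and results are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 3000
# Placeholder drivers standing in for DEM, GDP, population density, distances, etc.
X = rng.normal(size=(n, 5))
# Placeholder wildfire-probability target in (0, 1)
y = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 2] + 0.3 * rng.normal(size=n))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)

print("R2   =", round(r2_score(y_te, pred), 3))
print("RMSE =", round(float(np.sqrt(mean_squared_error(y_te, pred))), 4))
print("feature importances:", np.round(rf.feature_importances_, 3))
```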

23 pages, 882 KB  
Article
A Gauss Hypergeometric-Type Model for Heavy-Tailed Survival Times in Biomedical Research
by Jiju Gillariose, Mahmoud M. Abdelwahab, Joshin Joseph and Mustafa M. Hasaballah
Symmetry 2025, 17(11), 1795; https://doi.org/10.3390/sym17111795 - 24 Oct 2025
Abstract
In this study, we introduced and analyzed the Slash–Log–Logistic (SlaLL) distribution, a novel statistical model developed by applying the slash methodology to log–logistic and beta distributions. The SlaLL distribution is particularly suited for modeling datasets characterized by heavy tails and extreme values, frequently encountered in survival time analyses. We derived the mathematical representation of the distribution involving Gauss hypergeometric and beta functions, explicitly established the probability density function, cumulative distribution function, hazard rate function, and reliability function, and provided clear definitions of its moments. Through comprehensive simulation studies, the accuracy and robustness of maximum likelihood and Bayesian methods for parameter estimation were validated. Comparative empirical analyses demonstrated the SlaLL distribution’s superior fitting performance over well-known slash-based models, emphasizing its practical utility in accurately capturing the complexities of real-world survival time data. Full article
(This article belongs to the Section Mathematics)
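The slash methodology referred to here is usually a scale-mixture construction: a base variate divided by an independent power of a uniform, which thickens the upper tail. The sketch below samples such a slash-type log-logistic variate as a generic illustration of the construction; it is not necessarily the exact SlaLL definition, which also involves a beta component and Gauss hypergeometric functions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 100_000

# Base log-logistic (Fisk) variate with shape c and unit scale
t = stats.fisk.rvs(c=2.5, size=n, random_state=rng)

# Slash-type heavy-tailing: divide by U**(1/q), U ~ Uniform(0, 1); smaller q -> heavier tail
q = 2.0
u = rng.uniform(size=n)
x_slash = t / u ** (1.0 / q)

# The mixture inflates extreme quantiles relative to the base distribution
for p in (0.95, 0.99):
    print(p, round(float(np.quantile(t, p)), 2), round(float(np.quantile(x_slash, p)), 2))
```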

16 pages, 1110 KB  
Article
Forecasting the U.S. Renewable-Energy Mix with an ALR-BDARMA Compositional Time-Series Framework
by Harrison Katz and Thomas Maierhofer
Forecasting 2025, 7(4), 62; https://doi.org/10.3390/forecast7040062 - 23 Oct 2025
Abstract
Accurate forecasts of the U.S. renewable energy consumption mix are essential for planning transmission upgrades, sizing storage, and setting balancing market rules. We introduce a Bayesian Dirichlet ARMA model (BDARMA) tailored to monthly shares of hydro, geothermal, solar, wind, wood, municipal waste, and biofuels from January 2010 through January 2025. The mean vector is modeled with a parsimonious VAR(2) in additive log ratio space, while the Dirichlet concentration parameter follows an intercept plus five Fourier harmonics, allowing for seasonal widening and narrowing of predictive dispersion. Forecast performance is assessed with a 61-split rolling origin experiment that issues twelve month density forecasts from January 2019 to January 2024. Compared with three alternatives (a Gaussian VAR(2) fitted in transform space, a seasonal naive approach that repeats last year’s proportions, and a drift-free ALR random walk), BDARMA lowers the mean continuous ranked probability score by 15 to 60 percent, achieves componentwise 90 percent interval coverage near nominal, and maintains point accuracy (Aitchison RMSE) on par with the Gaussian VAR through eight months and within 0.02 units afterward. These results highlight BDARMA’s ability to deliver sharp and well-calibrated probabilistic forecasts for multivariate renewable energy shares without sacrificing point precision. Full article
(This article belongs to the Collection Energy Forecasting)
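The additive log-ratio (ALR) transform used here maps a D-part composition to D−1 unconstrained coordinates by taking logs against a reference part, so ordinary time-series machinery (the VAR(2) mean model) can operate off the simplex. A minimal sketch of the forward and inverse maps, with the last component as the reference; the example shares are made up, not EIA data.

```python
import numpy as np

def alr(x):
    """Additive log-ratio: composition (sums to 1) -> unconstrained R^(D-1)."""
    x = np.asarray(x, dtype=float)
    return np.log(x[..., :-1] / x[..., -1:])

def alr_inv(z):
    """Inverse ALR: append a zero coordinate and softmax back onto the simplex."""
    z = np.concatenate([z, np.zeros(z.shape[:-1] + (1,))], axis=-1)
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative monthly shares (hydro, geothermal, solar, wind, wood, waste, biofuels)
shares = np.array([0.22, 0.02, 0.08, 0.30, 0.18, 0.04, 0.16])
z = alr(shares)
print(np.round(z, 3))
print(np.allclose(alr_inv(z), shares))   # True: the transform is invertible
```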

22 pages, 1040 KB  
Article
ROC Calculation for Burst Traffic Packet Detection—An Old Problem, Newly Revised
by Marco Krondorf
Signals 2025, 6(4), 57; https://doi.org/10.3390/signals6040057 - 23 Oct 2025
Abstract
Burst traffic radio systems use short signal bursts, which are prepended with an a priori known preamble sequence. The burst receivers exploit these preamble sequences for burst start detection. The process of burst start detection is commonly known as Packet Detection (PD), which employs preamble sequence cross-correlation and threshold detection. One major figure of merit for PD performance is the so-called ROC—receiver operating characteristics. ROC describes the trade-off between the probability of missed detection vs. the probability of false alarm. This article describes how to calculate the ROC for specified preamble sequences by deriving the probability density function (PDF) of the cross-correlation metric. We address this long-standing problem in the context of LEO (low Earth orbit) satellite systems, where differentially modulated PN (pseudo-noise) sequences are used for packet detection. For this particular class of preamble signals, the standard Ricean PDF assumption no longer holds and needs to be revised accordingly within this article. Full article
(This article belongs to the Special Issue Recent Development of Signal Detection and Processing)
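The ROC notion used in this article pairs the false-alarm probability (threshold exceeded by noise alone) with the missed-detection probability (threshold not reached when a burst is present). The sketch below traces such a curve under the classical envelope assumptions, Rayleigh for noise-only and Ricean for signal-plus-noise; this is exactly the baseline the paper argues must be revised for differentially modulated PN preambles, and the SNR value is illustrative.

```python
import numpy as np
from scipy import stats

sigma = 1.0          # noise standard deviation per quadrature component
snr_amp = 3.0        # signal amplitude relative to sigma (illustrative)

thresholds = np.linspace(0.0, 8.0, 200)

# Noise-only correlation envelope: Rayleigh(sigma); signal present: Rice(b = A/sigma)
p_fa = stats.rayleigh.sf(thresholds, scale=sigma)            # P(metric > tau | H0)
p_md = stats.rice.cdf(thresholds, b=snr_amp, scale=sigma)    # P(metric <= tau | H1)

for tau in (2.0, 3.0, 4.0):
    i = int(np.argmin(np.abs(thresholds - tau)))
    print(f"tau={tau:.1f}  P_fa={p_fa[i]:.3e}  P_md={p_md[i]:.3e}")
```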

13 pages, 983 KB  
Article
Potential Role of Transferrin and Vascular Cell Adhesion Molecule 1 in Differential Diagnosis Among Patients with Tauopathic Atypical Parkinsonian Syndromes
by Natalia Madetko-Alster, Dagmara Otto-Ślusarczyk, Marta Struga, Patryk Chunowski and Piotr Alster
Diagnostics 2025, 15(21), 2676; https://doi.org/10.3390/diagnostics15212676 - 23 Oct 2025
Abstract
Background/Objectives: Transferrin is a multi-task protein commonly known for binding iron; however, it is involved in multiple crucial processes, including antimicrobial activity, the growth of different cell types, differentiation, chemotaxis, the cell cycle, and cytoprotection. Vascular cell adhesion molecule 1 (VCAM-1) is a cell surface glycoprotein which participates in inflammation and the trans-endothelial movement of leukocytes. Neither transferrin nor VCAM-1 has been studied in the context of progressive supranuclear palsy (PSP) or corticobasal syndrome (CBS). This study aimed to evaluate the utility of transferrin and VCAM-1 assessment for the in vivo examination of tauopathic atypical Parkinsonian syndromes. Methods: This study included 10 patients with clinically probable PSP-RS, 10 with clinically probable PSP-P, and 8 with probable CBS. Patients’ blood and urine were collected and analyzed. Twenty-four serum samples (from twelve males and twelve females) were obtained from age-matched healthy volunteers. Peripheral blood inflammatory ratios, including the neutrophil-to-lymphocyte ratio, the platelet-to-lymphocyte ratio, the neutrophil-to-monocyte ratio, the neutrophil-to-high-density lipoprotein ratio, and the monocyte-to-high-density lipoprotein ratio, were calculated. VCAM-1 and transferrin concentrations were measured in the serum and urine. The urinary biomarker results are not included in the main analysis due to the absence of a control group. Results: The highest concentrations of transferrin in the serum were observed in patients with PSP-P, followed by PSP-RS and CBS. Statistically significant differences were found between PSP-P and healthy controls (p < 0.0001) and PSP-RS and healthy controls (p < 0.0001). The highest levels of serum VCAM-1 were observed in the PSP-P group. Significant differences were found between PSP-P and healthy controls (p < 0.0001), PSP-P and CBS (p < 0.001), and PSP-RS and healthy controls (p < 0.001). Serum VCAM-1 levels were negatively correlated with the NLR in CBS patients (p < 0.03; r = −0.74). Serum transferrin levels were negatively correlated with the NHR in CBS patients (p < 0.04; r = −0.64). ROC curve analyses were conducted to evaluate the diagnostic utility of serum transferrin and VCAM-1 in distinguishing tauopathic APS patients from controls. Transferrin showed excellent diagnostic performance, with an AUC of 0.975 (95% CI: 0.888–0.999; p < 0.0001), a sensitivity of 96.4%, and a specificity of 95.8% at the optimal cut-off (>503.0). VCAM-1 demonstrated good accuracy, with an AUC of 0.839 (95% CI: 0.711–0.926; p < 0.0001), a sensitivity of 75.0%, and a specificity of 91.7% at the optimal cut-off (>463.9). Conclusions: The obtained results indicate the potential role of transferrin and VCAM-1 in the pathogenesis of tauopathic APSs and highlight the need for further exploration in this field. Full article
(This article belongs to the Section Clinical Laboratory Medicine)
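The sensitivity/specificity pairs reported "at the optimal cut-off" are typically obtained by maximizing Youden's J (sensitivity + specificity − 1) along the ROC curve. A minimal sketch with synthetic biomarker values, not the study's transferrin or VCAM-1 data:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(5)
# Synthetic serum concentrations for controls vs. patients (units arbitrary)
controls = rng.normal(420, 40, size=24)
patients = rng.normal(560, 60, size=28)

values = np.concatenate([controls, patients])
labels = np.concatenate([np.zeros(24), np.ones(28)])

fpr, tpr, thresholds = roc_curve(labels, values)
j = tpr - fpr                        # Youden's J at each candidate cut-off
best = int(np.argmax(j))

print("AUC        :", round(roc_auc_score(labels, values), 3))
print("cut-off    :", round(float(thresholds[best]), 1))
print("sensitivity:", round(float(tpr[best]), 3))
print("specificity:", round(float(1 - fpr[best]), 3))
```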

25 pages, 4737 KB  
Article
The Fine Structure of Genome Statistics—The Frequency and Size
by Piotr H. Pawłowski and Piotr Zielenkiewicz
Life 2025, 15(11), 1648; https://doi.org/10.3390/life15111648 - 22 Oct 2025
Abstract
A determination and mathematical analysis of the statistics of gene numbers in genomes was proposed. It establishes sampling ranges and provides an analytical description of the probability density function, which represents the likelihood of the number of genes in sequenced genomes falling within a specific range of values. The components of the developed statistical multi-Poissonian model revealed the fundamental mechanisms underlying the evolution of life and identified the specific ranges of their dominant influence. The quantitative relations between the statistics of the number of genes and the genome size were shown. A mathematical model of genome size evolution was proposed, identifying subpopulations of intensive and extensive genes associated with protein-coding genes, pseudogenes, and non-coding genes. Full article
(This article belongs to the Special Issue Feature Papers in Synthetic Biology and Systems Biology 2025)
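A multi-Poissonian description of gene-count statistics amounts to a weighted mixture of Poisson components, each dominating a different range of gene numbers. The sketch below evaluates such a mixture probability mass function over a grid of counts; the weights and component means are purely illustrative, not the paper's fitted values.

```python
import numpy as np
from scipy import stats

# Illustrative mixture: three Poisson components with weights summing to 1
weights = np.array([0.5, 0.35, 0.15])
means = np.array([2_000, 5_000, 20_000])      # hypothetical mean gene numbers

def mixture_pmf(n):
    """P(N = n) under a multi-Poissonian model; n may be an array of counts."""
    n = np.atleast_1d(n)[:, None]
    return (weights * stats.poisson.pmf(n, means)).sum(axis=1)

counts = np.arange(0, 40_001)
pmf = mixture_pmf(counts)
print("total probability:", round(float(pmf.sum()), 4))          # ~1.0
print("mixture mean gene number:", int((counts * pmf).sum()))    # ~5750 here
```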

28 pages, 1946 KB  
Article
Efficient Analysis of the Gompertz–Makeham Theory in Unitary Mode and Its Applications in Petroleum and Mechanical Engineering
by Refah Alotaibi, Hoda Rezk and Ahmed Elshahhat
Axioms 2025, 14(11), 775; https://doi.org/10.3390/axioms14110775 - 22 Oct 2025
Abstract
This paper introduces a novel three-parameter probability model, the unit-Gompertz–Makeham (UGM) distribution, designed for modeling bounded data on the unit interval (0,1). By transforming the classical Gompertz–Makeham distribution, we derive a unit-support distribution that flexibly accommodates a wide range of shapes in both the density and hazard rate functions, including increasing, decreasing, bathtub, and inverted-bathtub forms. The UGM density exhibits rich patterns such as symmetric, unimodal, U-shaped, J-shaped, and uniform-like forms, enhancing its ability to fit real-world bounded data more effectively than many existing models. We provide a thorough mathematical treatment of the UGM distribution, deriving explicit expressions for its quantile function, mode, central and non-central moments, mean residual life, moment-generating function, and order statistics. To facilitate parameter estimation, eight classical techniques, including maximum likelihood, least squares, and Cramér–von Mises methods, are developed and compared via a detailed simulation study assessing their accuracy and robustness under varying sample sizes and parameter settings. The practical relevance and superior performance of the UGM distribution are demonstrated using two real-world engineering datasets, where it outperforms existing bounded models, such as beta, Kumaraswamy, unit-Weibull, unit-gamma, and unit-Birnbaum–Saunders. These results highlight the UGM distribution’s potential as a versatile and powerful tool for modeling bounded data in reliability engineering, quality control, and related fields. Full article
(This article belongs to the Special Issue Advances in the Theory and Applications of Statistical Distributions)
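One convenient way to see the "unit" construction is via the Gompertz–Makeham hazard λ + α·exp(βx), whose survival function factorizes, so a GM lifetime can be sampled as the minimum of an exponential and a Gompertz variate and then mapped onto (0, 1). The mapping used below, Y = exp(−X), is an assumed illustrative transform; the abstract does not spell out the paper's exact UGM transformation, and the parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
alpha, beta, lam = 0.02, 1.5, 0.3   # illustrative Gompertz-Makeham parameters
n = 100_000

# GM hazard = lam + alpha*exp(beta*x)  =>  X = min(Exponential(lam), Gompertz(alpha, beta))
x_exp = stats.expon.rvs(scale=1.0 / lam, size=n, random_state=rng)
x_gom = stats.gompertz.rvs(c=alpha / beta, scale=1.0 / beta, size=n, random_state=rng)
x = np.minimum(x_exp, x_gom)

# Assumed unit transform (illustrative): Y = exp(-X) maps (0, inf) onto (0, 1)
y = np.exp(-x)
print("support:", round(float(y.min()), 4), "-", round(float(y.max()), 4))
print("sample mean / median on (0,1):", round(float(y.mean()), 3), round(float(np.median(y)), 3))
```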

21 pages, 2677 KB  
Article
Compatibility of a Competition Model for Explaining Eye Fixation Durations During Free Viewing
by Carlos M. Gómez, María A. Altahona-Medina, Gabriela Barrera and Elena I. Rodriguez-Martínez
Entropy 2025, 27(10), 1079; https://doi.org/10.3390/e27101079 - 18 Oct 2025
Abstract
Inter-saccadic times, or eye fixation durations (EFDs), are relatively stable at around 250 ms, equivalent to four saccades per second. However, the mean and standard deviation are not sufficient to describe the frequency histogram of EFDs. The exGaussian has been proposed for fitting EFD histograms. The present report fits a competition model (C model) between the saccadic and fixation networks to the EFD histograms. This model operates at a rather conceptual level (the computational level in Marr’s classification). Both models were fitted to EFDs from an open database of 179,473 eye fixations. The C model, like the exGaussian model, proved compatible with the observed EFD distributions. The two parameters of the C model can be ascribed to (i) a refractory period for new saccades, modeled by a sigmoid equation (A parameter), and (ii) the ps parameter, related to the continuous competition between the saccadic network tied to the saliency map and the eye fixation network, modeled through a geometric probability density function. The model suggests that competition between neural networks is an organizational property of the brain that facilitates the decision process for action and perception. In visual scene scanning, the dynamics of the C model justify the early post-saccadic stability of the foveated image and the subsequent exploration of a broad space in the observed image. The code to extract the data and run the model is provided in the Supplementary Materials. Additionally, the entropy of EFD is reported. Full article
(This article belongs to the Special Issue Dynamics in Biological and Social Networks)
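The exGaussian baseline mentioned here (a Gaussian convolved with an exponential) is available in SciPy as exponnorm, parameterized by K = τ/σ. The sketch below fits it to synthetic fixation durations and recovers μ, σ, and τ; it illustrates the comparison baseline only, not the authors' competition (C) model, and the data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic eye-fixation durations (ms): Gaussian component plus exponential tail
mu, sigma, tau = 180.0, 35.0, 70.0
efd = rng.normal(mu, sigma, size=20_000) + rng.exponential(tau, size=20_000)

# scipy's exponnorm uses K = tau / sigma, loc = mu, scale = sigma
K, loc, scale = stats.exponnorm.fit(efd)
print(f"mu ~ {loc:.1f} ms, sigma ~ {scale:.1f} ms, tau ~ {K * scale:.1f} ms")
print("sample mean:", round(float(efd.mean()), 1), "ms (close to the ~250 ms in the abstract)")
```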

25 pages, 4981 KB  
Article
Environmental Context Indicator for Evaluating Quality of GNSS Observation Environment Using Android Smartphone
by Bong-Gyu Park, Miso Kim, Jong-Sung Lee and Kwan-Dong Park
Sensors 2025, 25(20), 6452; https://doi.org/10.3390/s25206452 - 18 Oct 2025
Abstract
With location-based services becoming more common, smartphone global navigation satellite systems (GNSS) have begun to play a significant role in daily life. Providing reliable location information to smartphone users requires considering localization uncertainty, which varies with the surrounding environment. In this study, we developed an environmental context indicator (ECI) to provide interpretable, continuous information on GNSS observation quality using carrier-to-noise density ratio (C/N0), the number of visible satellites, and positional dilution of precision (PDOP). The ECI was developed using a Samsung Galaxy S21+ and satellite signals from global positioning system (GPS) L1/L5, Galileo E1/E5, and BeiDou B1, consisting of three components: a real-valued indicator ranging from 0 to 6, an integer-valued indicator ranging from 1 to 5, and a probability density ratio representing the reliability of the integer-valued indicator. In experimental results, the ECI reflected the variations in the observation environment and corresponding quality changes. ECI values were lowest in open areas, increasing when approaching an urban area, and reaching their maximum in indoor environments where signal reception is severely limited. Consequently, ECI was influenced by building density, exhibiting large and frequent changes, particularly in urban areas. Full article
(This article belongs to the Section Navigation and Positioning)
