Search Results (368)

Search Parameters:
Keywords = default prediction

27 pages, 1977 KB  
Article
Explainable Prediction of UHPC Tensile Strength Using Machine Learning with Engineered Features and Multi-Algorithm Comparative Evaluation
by Zhe Zhang, Tianqin Zeng, Yongge Zeng and Ping Zhu
Buildings 2025, 15(17), 3217; https://doi.org/10.3390/buildings15173217 - 6 Sep 2025
Abstract
To explore a direct predictive model for the tensile strength of ultra-high-performance concrete (UHPC), machine learning (ML) algorithms are presented. Initially, a database comprising 178 samples of UHPC tensile strength with varying parameters is established. Then, feature engineering strategies are proposed to optimize the robustness of ML models under a small-sample condition. Further, the performance and efficiency of the algorithms are compared under default hyperparameters and hyperparameter tuning, respectively. Moreover, SHapley Additive exPlanations (SHAP) enables analysis of the relationships between UHPC tensile strength and its influencing factors. The quantitative results indicate that ensemble algorithms exhibit superior performance under default hyperparameters, with R² values above 0.92. After hyperparameter tuning, both conventional and ensemble models achieve R² values exceeding 0.94. However, Bayesian ridge regression (BRR) consistently demonstrates suboptimal performance, irrespective of hyperparameter tuning. Notably, Categorical Boosting (CatBoost) requires a training duration of 1208 s, considerably longer than the other algorithms. The most influential feature is the fiber reinforcement index, with a contribution of 37.5%, followed by the water-to-cement ratio, strain rate, and cross-sectional size. The nonlinear relationships between UHPC tensile strength and the top four factors are visualized, and critical thresholds are identified. Full article
(This article belongs to the Special Issue Research on Structural Analysis and Design of Civil Structures)
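A minimal sketch of the comparative setup this abstract describes: a conventional regressor and two ensemble models scored by cross-validated R² under default hyperparameters, then one model tuned. Synthetic data stand in for the 178-sample UHPC database, the SHAP analysis is omitted, and this is not the authors' code.

```python
# Sketch only: synthetic data in place of the UHPC database; not the authors' code.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_regression(n_samples=178, n_features=8, noise=10.0, random_state=0)

models = {
    "BayesianRidge": BayesianRidge(),
    "RandomForest": RandomForestRegressor(random_state=0),
    "GradientBoosting": GradientBoostingRegressor(random_state=0),
}

# R^2 under default hyperparameters (5-fold cross-validation).
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name} (default): R2 = {r2:.3f}")

# Simple hyperparameter tuning for one ensemble model.
grid = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    {"n_estimators": [100, 300], "learning_rate": [0.05, 0.1], "max_depth": [2, 3]},
    cv=5, scoring="r2",
)
grid.fit(X, y)
print("GradientBoosting (tuned): R2 =", round(grid.best_score_, 3))
```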
29 pages, 1158 KB  
Article
Financial Systemic Risk and the COVID-19 Pandemic
by Xin Huang
Risks 2025, 13(9), 169; https://doi.org/10.3390/risks13090169 - 4 Sep 2025
Viewed by 78
Abstract
The COVID-19 pandemic has caused market turmoil and economic distress. To understand the effect of the pandemic on the U.S. financial systemic risk, we analyze the explanatory power of detailed COVID-19 data on three market-based systemic risk measures (SRMs): Conditional Value at Risk, Distress Insurance Premium, and SRISK. In the time-series dimension, we use the Dynamic OLS model and find that financial variables, such as credit default swap spreads, equity correlation, and firm size, significantly affect the SRMs, but the COVID-19 variables do not appear to drive the SRMs. However, if we focus on the first wave of the COVID-19 pandemic in March 2020, we find a positive and significant COVID-19 effect, especially before the government interventions. In the cross-sectional dimension, we run fixed-effect and event-study regressions with clustered variance-covariance matrices. We find that market capitalization helps to reduce a firm’s contribution to the SRMs, while firm size significantly predicts the surge in a firm’s SRM contribution when the pandemic first hits the system. The policy implications include that proper market interventions can help to mitigate the negative pandemic effect, and policymakers should continue the current regulation of required capital holding and consider size when designating systemically important financial institutions. Full article
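A rough sketch of the time-series step described above: a dynamic-OLS-style regression of a systemic risk measure on a financial variable, with leads and lags of the differenced regressor and HAC standard errors. The series are simulated and the variable names are placeholders, not the paper's data or exact specification.

```python
# Sketch only: simulated series and placeholder names, not the paper's specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
cds_spread = np.cumsum(rng.normal(size=n))               # e.g., a CDS spread level
srm = 0.8 * cds_spread + rng.normal(scale=0.5, size=n)   # a systemic risk measure

df = pd.DataFrame({"srm": srm, "cds": cds_spread})
df["d_cds"] = df["cds"].diff()
for j in (1, 2):   # leads and lags of the differenced regressor, as in DOLS
    df[f"d_cds_lead{j}"] = df["d_cds"].shift(-j)
    df[f"d_cds_lag{j}"] = df["d_cds"].shift(j)

df = df.dropna()
X = sm.add_constant(df.drop(columns=["srm"]))
# HAC (Newey-West) standard errors for serially correlated residuals.
res = sm.OLS(df["srm"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(res.summary())
```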
32 pages, 1239 KB  
Article
Research on a GA-XGBoost and LSTM-Based Green Material Selection Model for Ancient Building Renovation
by Yingfeng Kuang, Xiaolong Chen, Hongfeng Zhang and Cora Un In Wong
Buildings 2025, 15(17), 3094; https://doi.org/10.3390/buildings15173094 - 28 Aug 2025
Viewed by 345
Abstract
This study aims to address the challenge of balancing historical preservation and sustainable material selection in ancient building renovations, particularly in regions with unique climatic conditions like Hunan Province. The research proposes a hybrid model integrating Genetic Algorithm-optimized Extreme Gradient Boosting (GA-XGBoost) and Long Short-Term Memory (LSTM) networks. The GA-XGBoost component optimizes hyperparameters to predict material performance, while the LSTM network captures temporal dependencies in environmental and material degradation data. A multi-objective optimization framework is developed to simultaneously prioritize preservation integrity and green performance. The methodology is validated through a case study on an ancient architectural complex in Rucheng, Hunan Province. Key results demonstrate that the hybrid model achieves superior accuracy in material selection, with an 18–23% reduction in embodied energy (compared to conventional AHP-TOPSIS methods) and a 21.9% improvement in prediction accuracy (versus standalone XGBoost with default hyperparameters). Pareto-optimal solutions from the multi-objective framework identify material combinations that balance historical authenticity (achieving 92% substrate compatibility) with substantial sustainability gains. The model also identifies optimal material combinations, such as lime-pozzolan mortars with rice husk ash additives, which enhance moisture buffering capacity by 28% (relative to traditional lime mortar benchmarks) while maintaining 92% compatibility with original substrates (based on ASTM C270 compatibility tests). The findings highlight the model’s effectiveness in bridging heritage conservation and modern sustainability requirements. The study contributes a scalable and interpretable framework for green material selection, offering practical implications for cultural heritage projects worldwide. Future research directions include expanding the model’s applicability to other climate zones and integrating circular economy principles for broader sustainability impact. Preliminary analysis indicates the framework’s adaptability to other climate zones through adjustment of key material property weightings. Full article
(This article belongs to the Section Building Materials, and Repair & Renovation)
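A minimal sketch of the GA-XGBoost component under stated assumptions: a small genetic-algorithm loop (selection, crossover, mutation) searching XGBoost hyperparameters by cross-validated R² on simulated material-performance data. The LSTM branch for temporal degradation data is omitted, and this is not the authors' implementation.

```python
# Sketch only: simulated data; not the authors' GA-XGBoost implementation.
import random
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)
random.seed(0)

SPACE = {"n_estimators": [100, 200, 400],
         "max_depth": [2, 3, 4, 6],
         "learning_rate": [0.03, 0.05, 0.1, 0.2]}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(params):
    # Cross-validated R^2 of an XGBoost regressor with these hyperparameters.
    return cross_val_score(XGBRegressor(random_state=0, **params),
                           X, y, cv=3, scoring="r2").mean()

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(params):
    child = dict(params)
    key = random.choice(list(SPACE))
    child[key] = random.choice(SPACE[key])
    return child

population = [random_individual() for _ in range(8)]
for generation in range(5):
    parents = sorted(population, key=fitness, reverse=True)[:4]   # selection
    offspring = [mutate(crossover(*random.sample(parents, 2))) for _ in range(4)]
    population = parents + offspring

best = max(population, key=fitness)
print("best hyperparameters:", best, "CV R2:", round(fitness(best), 3))
```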
27 pages, 3001 KB  
Article
Effects of Civil Wars on the Financial Soundness of Banks: Evidence from Sudan Using Altman’s Models and Stress Testing
by Mudathir Abuelgasim and Said Toumi
J. Risk Financial Manag. 2025, 18(9), 476; https://doi.org/10.3390/jrfm18090476 - 26 Aug 2025
Viewed by 539
Abstract
This study assesses the financial soundness of Sudanese commercial banks during escalating civil conflict by integrating Altman’s Z-score models with scenario-based stress testing. Using audited financial data from 2016 to 2022 (pre-war) and projections through to 2028, the analysis evaluates resilience under low- and high-intensity conflict scenarios. Altman’s Model 3 (for non-industrial firms) and Model 4 (for emerging markets) are applied to capture liquidity, retained earnings, profitability, and leverage dynamics. The findings reveal relative stability during 2017–2020 and in 2022, contrasted by significant vulnerability in 2016 and 2021 due to macroeconomic deterioration, sanctions, and political instability. Liquidity emerged as the most critical driver of Z-score performance, followed by earnings retention and profitability, while leverage showed a context-specific positive effect under Sudan’s Islamic finance framework. Stress testing indicates that even under low-intensity conflict, rising liquidity risk, capital erosion, and credit risk threaten sectoral stability by 2025. High-intensity conflict projections suggest systemic collapse by 2028, characterized by unsustainable liquidity depletion, near-zero capital adequacy, and widespread defaults. The results demonstrate a direct relationship between conflict duration and systemic fragility, affirming the predictive value of Altman’s models when combined with stress testing. Policy implications include the urgent need for enhanced risk-based supervision, Basel II/III implementation, crisis reserves, contingency planning, and coordinated regulatory interventions to safeguard the stability of the banking sector in fragile states. Full article
(This article belongs to the Section Banking and Finance)
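For readers unfamiliar with the scores used here, a small worked sketch of the Altman-type calculation: the coefficients follow the commonly cited Z''-score (non-manufacturer) and emerging-market formulations, and the input figures are invented, so this illustrates the mechanics rather than the paper's exact specification.

```python
# Sketch only: commonly cited Altman Z'' coefficients and made-up inputs.
def altman_z_double_prime(working_capital, retained_earnings, ebit,
                          book_equity, total_assets, total_liabilities):
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = book_equity / total_liabilities
    return 6.56 * x1 + 3.26 * x2 + 6.72 * x3 + 1.05 * x4

def emerging_market_score(z_double_prime):
    # The emerging-market variant adds a constant of 3.25 to the Z'' score.
    return 3.25 + z_double_prime

z = altman_z_double_prime(working_capital=120, retained_earnings=80, ebit=45,
                          book_equity=200, total_assets=1000, total_liabilities=700)
print(f"Z'' = {z:.2f}, EM score = {emerging_market_score(z):.2f}")
# Common interpretation bands for Z'': below about 1.1 distress, roughly 1.1-2.6
# grey zone, above about 2.6 relatively safe. A stress test shocks the inputs
# (liquidity, earnings, leverage) and recomputes the score under each scenario.
```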
23 pages, 598 KB  
Article
The Good, the Bad, and the Bankrupt: A Super-Efficiency DEA and LASSO Approach Predicting Corporate Failure
by Ioannis Dokas, George Geronikolaou, Sofia Katsimardou and Eleftherios Spyromitros
J. Risk Financial Manag. 2025, 18(9), 471; https://doi.org/10.3390/jrfm18090471 - 24 Aug 2025
Viewed by 404
Abstract
Corporate failure prediction remains a major topic in the literature. Numerous methodologies have been established for its assessment, while data envelopment analysis (DEA) has received particular attention. This study contributes to the literature, establishing a new approach in the construction process of prediction models based on the combination of logistic LASSO and an advanced version of data envelopment analysis (DEA). We adopt the modified slacks-based super-efficiency measure (modified super-SBM-DEA), following the “Worst practice frontier” approach, and focus on the selection process of predictive variables, implementing the logistic LASSO regression. A balanced sample with one-to-one matching between forty-five firms that filed for reorganization under U.S. bankruptcy law during the period 2014–2020 and forty-five non-failed firms of a similar size from the U.S. energy economic sector has been used for the empirical analysis. The proposed methodology offers superior results in terms of corporate failure prediction accuracy. For the dynamic assessment of failure, Malmquist DEA has been implemented during the five fiscal years prior to the event of failure, offering insights into financial distress before the event of a default. The model outperforms alternatives by achieving higher overall prediction accuracy (85.6%), the better identification of failed firms (91.1%), and the improved classification of non-failed firms (80%). Compared to prior DEA-based models, it demonstrates superior predictive performance with lower Type I and Type II errors and higher sensitivity as well as specificity. These results highlight the model’s effectiveness as a reliable early warning tool for bankruptcy prediction. Full article
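A minimal sketch of the variable-selection step, assuming synthetic data and placeholder ratio names: a logistic LASSO that shrinks uninformative financial ratios to zero. The modified super-SBM DEA stage and the Malmquist analysis are omitted.

```python
# Sketch only: synthetic firms and placeholder ratio names; DEA stage omitted.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=90, n_features=12, n_informative=4,
                           random_state=0)   # 90 firms: failed vs. non-failed
ratio_names = [f"ratio_{i}" for i in range(X.shape[1])]

X_std = StandardScaler().fit_transform(X)
lasso_logit = LogisticRegressionCV(penalty="l1", solver="liblinear",
                                   Cs=10, cv=5, random_state=0).fit(X_std, y)

selected = [name for name, coef in zip(ratio_names, lasso_logit.coef_[0])
            if abs(coef) > 1e-6]
print("ratios retained by the logistic LASSO:", selected)
```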
19 pages, 1612 KB  
Article
Listening for Region: Phonetic Cue Sensitivity and Sociolinguistic Development in L2 Spanish
by Lauren B. Schmidt
Languages 2025, 10(8), 198; https://doi.org/10.3390/languages10080198 - 20 Aug 2025
Viewed by 511
Abstract
This study investigates how second language (L2) learners of Spanish identify the regional origin of native Spanish speakers and whether specific phonetic cues predict dialect identification accuracy across proficiency levels. Situated within a growing body of work on sociolinguistic competence, this research addresses the development of learners’ ability to use linguistic forms not only for communication but also for social interpretation. A dialect identification task was administered to 111 American English-speaking learners of Spanish and 19 native Spanish speakers. Participants heard sentence-length stimuli targeting regional phonetic features and selected the speaker’s country of origin. While L2 learners were able to identify regional dialects above chance, accuracy was low and significantly below that of native speakers. Higher-proficiency learners demonstrated improved identification, especially for speakers from Spain and Argentina, and relied more on salient phonetic cues (e.g., [θ], [ʃ]). No significant development was found for identification of Mexican or Puerto Rican varieties. Unlike native speakers, L2 learners did not show sensitivity to broader macrodialect groupings; instead, they frequently defaulted to high-exposure varieties (e.g., Spain, Mexico) regardless of the phonetic cues present. Findings suggest that sociophonetic perception in L2 Spanish develops gradually and unevenly, shaped by cue salience and exposure. Full article
(This article belongs to the Special Issue Second Language Acquisition and Sociolinguistic Studies)
22 pages, 1833 KB  
Article
Survival Analysis for Credit Risk: A Dynamic Approach for Basel IRB Compliance
by Fernando L. Dala, Manuel L. Esquível and Raquel M. Gaspar
Risks 2025, 13(8), 155; https://doi.org/10.3390/risks13080155 - 15 Aug 2025
Viewed by 424
Abstract
This paper uses survival analysis as a tool to assess credit risk in loan portfolios within the framework of the Basel Internal Ratings-Based (IRB) approach. By modeling the time to default using survival functions, the methodology allows for the estimation of default probabilities and the dynamic evaluation of portfolio performance. The model explicitly accounts for right censoring and demonstrates strong predictive accuracy. Furthermore, by incorporating additional information about the portfolio’s loss process, we show how to empirically estimate key risk measures—such as Value at Risk (VaR) and Expected Shortfall (ES)—that are sensitive to the age of the loans. Through simulations, we illustrate how loss distributions and the corresponding risk measures evolve over the loans’ life cycles. Our approach emphasizes the significant dependence of risk metrics on loan age, illustrating that risk profiles are inherently dynamic rather than static. Using a real-world dataset of 10,479 loans issued by Angolan commercial banks, combined with assumptions regarding loss processes, we demonstrate the practical applicability of the proposed methodology. This approach is particularly relevant for emerging markets with limited access to advanced credit risk modeling infrastructure. Full article
(This article belongs to the Special Issue Advances in Risk Models and Actuarial Science)
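A small sketch of the core idea, not the paper's model: estimate a survival function for time to default under right censoring and read off an age-dependent probability of default as 1 - S(t). The loans are simulated rather than the Angolan portfolio, and the lifelines package is an assumed tool choice.

```python
# Sketch only: simulated loans; lifelines is an assumed tool choice.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 1000
true_default_time = rng.exponential(scale=60, size=n)        # months until default
censor_time = rng.uniform(12, 72, size=n)                    # observation window
duration = np.minimum(true_default_time, censor_time)
defaulted = (true_default_time <= censor_time).astype(int)   # 0 = right-censored

kmf = KaplanMeierFitter()
kmf.fit(duration, event_observed=defaulted)

for t in (12, 24, 36):
    pd_t = 1.0 - float(kmf.predict(t))   # PD by loan age t equals 1 - S(t)
    print(f"estimated probability of default within {t} months: {pd_t:.3f}")
```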
15 pages, 1001 KB  
Article
Do Fintech Firms Excel in Risk Assessment for U.S. 30-Year Conforming Residential Mortgages?
by Zilong Liu and Hongyan Liang
FinTech 2025, 4(3), 42; https://doi.org/10.3390/fintech4030042 - 14 Aug 2025
Viewed by 341
Abstract
This study examines whether fintech lenders outperform traditional banks and non-fintech non-banks in risk assessment for U.S. 30-year fixed-rate conforming mortgages. Analyzing Fannie Mae and Freddie Mac loans from Q1 2012 to Q1 2020 using ROC/AUC and risk-pricing regressions, we find that fintech lenders have lower predictive accuracy and misaligned pricing, charging higher rates to borrowers who remain current and lower rates to those who default or prepay. These results indicate that conforming mortgage regulations and rapid loan sales to government-sponsored enterprises (GSEs) diminish fintech firms’ incentives for enhanced borrower screening, thus reducing their risk assessment effectiveness. Full article
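A brief sketch of the ROC/AUC comparison described above, on simulated loans with placeholder lender groups: how well the contracted note rate ranks eventual defaulters above non-defaulters, computed separately by lender type.

```python
# Sketch only: simulated loans and placeholder lender groups, not the GSE data.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
loans = pd.DataFrame({
    "lender_type": rng.choice(["bank", "fintech", "nonfintech_nonbank"], size=n),
    "note_rate": rng.normal(4.0, 0.5, size=n),   # contracted interest rate
})
# Simulated default outcomes, weakly related to the priced rate.
default_prob = 1 / (1 + np.exp(-(loans["note_rate"] - 4.5)))
loans["default"] = rng.binomial(1, default_prob)

# AUC of the note rate as a predictor of default, by lender type: well-aligned
# pricing should rank eventual defaulters above borrowers who stay current.
for lender, group in loans.groupby("lender_type"):
    auc = roc_auc_score(group["default"], group["note_rate"])
    print(f"{lender}: AUC = {auc:.3f}")
```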
14 pages, 557 KB  
Review
Advances in Kidney Transplant, Machine Perfusion, and Viability Markers
by Stephanie Y. Ohara, Mariana Chavez-Villa, Shennen Mao, Jacob Clendenon, Julie Heimbach, Randi Ryan, Lavanya Kodali, Michelle C. Nguyen, Rafael Nateras-Nunez and Caroline C. Jadlowiec
Kidney Dial. 2025, 5(3), 37; https://doi.org/10.3390/kidneydial5030037 - 14 Aug 2025
Viewed by 661
Abstract
Despite improvements in kidney transplantation rates, the shortage of donor kidneys remains a critical issue, exacerbated by the non-utilization of recovered kidneys due to quality concerns, necessitating advancements in perfusion methods to enhance graft outcomes and usage. Although static cold storage remains the default standard for kidney preservation, newer methods like hypothermic machine perfusion have shown improved outcomes, including reduced delayed graft function and better survival rates. Hypothermic oxygenated machine perfusion and normothermic machine perfusion (NMP) offer some potential clinical benefits, but studies to date have demonstrated mixed results. In the United States, LifePort and XVIVO’s Kidney Assist Transport are the most popular hypothermic perfusion devices, with NMP devices mostly in trials. Combining perfusion with biomarkers such as mitochondrial flavin mononucleotide, neutrophil gelatinase-associated lipocalin, and osteopontin shows promise in assessing kidney viability and predicting post-transplant outcomes, though further research is needed. Emphasis on repair biomarkers, such as uromodulin and osteopontin, aims to better predict graft outcomes and develop new therapies. While notable advancements have been made in the use of machine perfusion and viability testing for liver transplantation, additional research with larger sample sizes is essential to substantiate these results and enhance kidney transplantation outcomes. Full article
20 pages, 639 KB  
Article
AI-Powered Reduced-Form Model for Default Rate Forecasting
by Jacopo Giacomelli
Risks 2025, 13(8), 151; https://doi.org/10.3390/risks13080151 - 13 Aug 2025
Viewed by 427
Abstract
This study aims to combine deep and recurrent neural networks with a reduced-form portfolio model to predict future default rates across economic sectors. The industry-specific forecasts for Italian default rates produced with the proposed approach demonstrate its effectiveness, achieving significant levels of explained variance. The results obtained show that enhancing a reduced-form model by integrating it with neural networks is possible and practical for multivariate forecasting of future default frequencies. In our analysis, we utilize the recently proposed RecessionRisk+, a reduced-form latent-factor model developed for default and recession risk management applications as an improvement of the well-known CreditRisk+ model. The model has been empirically verified to exhibit some predictive power concerning future default rates. However, the theoretical framework underlying the model does not provide the elements necessary to define a proper estimator for forecasting the target default rates, leaving space for the application of a neural network framework to retrieve the latent information useful for default rate forecasting purposes. Among the neural network models tested in combination with RecessionRisk+, the best results are obtained with shallow LSTM networks. Full article
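An illustrative sketch of the best-performing configuration reported above, a shallow LSTM forecasting sector-level default rates from a window of past rates. The series are simulated, Keras is an assumed tool choice, and the RecessionRisk+ latent-factor inputs are omitted.

```python
# Sketch only: simulated default rates; Keras is an assumed tool choice.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
n_quarters, n_sectors, lookback = 120, 5, 8
rates = np.abs(rng.normal(0.02, 0.005, size=(n_quarters, n_sectors)))  # default rates

# Build supervised windows: past `lookback` quarters -> next quarter's rates.
X = np.stack([rates[t:t + lookback] for t in range(n_quarters - lookback)])
y = rates[lookback:]

model = keras.Sequential([
    keras.layers.Input(shape=(lookback, n_sectors)),
    keras.layers.LSTM(16),            # a single, shallow recurrent layer
    keras.layers.Dense(n_sectors),    # one forecast per economic sector
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:-12], y[:-12], epochs=20, batch_size=16, verbose=0)

print("held-out MSE:", float(model.evaluate(X[-12:], y[-12:], verbose=0)))
```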
30 pages, 2261 KB  
Article
Multilayer Perceptron Mapping of Subjective Time Duration onto Mental Imagery Vividness and Underlying Brain Dynamics: A Neural Cognitive Modeling Approach
by Matthew Sheculski and Amedeo D’Angiulli
Mach. Learn. Knowl. Extr. 2025, 7(3), 82; https://doi.org/10.3390/make7030082 - 13 Aug 2025
Viewed by 540
Abstract
According to a recent experimental phenomenology–information processing theory, the sensory strength, or vividness, of visual mental images self-reported by human observers reflects the intensive variation in subjective time duration during the process of generation of said mental imagery. The primary objective of this study was to test the hypothesis that a biologically plausible essential multilayer perceptron (MLP) architecture can validly map the phenomenological categories of subjective time duration onto levels of subjectively self-reported vividness. A secondary objective was to explore whether this type of neural network cognitive modeling approach can give insight into plausible underlying large-scale brain dynamics. To achieve these objectives, vividness self-reports and reaction times from a previously collected database were reanalyzed using multilayered perceptron network models. The input layer consisted of six levels representing vividness self-reports and a reaction time cofactor. A single hidden layer consisted of three nodes representing the salience, task positive, and default mode networks. The output layer consisted of five levels representing Vittorio Benussi’s subjective time categories. Across different models of networks, Benussi’s subjective time categories (Level 1 = very brief, 2 = brief, 3 = present, 4 = long, 5 = very long) were predicted by visual imagery vividness level 1 (=no image) to 5 (=very vivid) with over 90% success in classification accuracy, precision, recall, and F1-score. This accuracy level was maintained after 5-fold cross validation. Linear regressions, Welch’s t-test for independent coefficients, and Pearson’s correlation analysis were applied to the resulting hidden node weight vectors, obtaining evidence for strong correlation and anticorrelation between nodes. This study successfully mapped Benussi’s five levels of subjective time categories onto the activation patterns of a simple MLP, providing a novel computational framework for experimental phenomenology. Our results revealed structured, complex dynamics between the task positive network (TPN), the default mode network (DMN), and the salience network (SN), suggesting that the neural mechanisms underlying temporal consciousness involve flexible network interactions beyond the traditional triple network model. Full article
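A minimal sketch of the stated architecture on synthetic data: a multilayer perceptron with six inputs (vividness levels plus a reaction-time cofactor), a single three-node hidden layer, and five output categories, evaluated with 5-fold cross-validation. This is not the study's code or data.

```python
# Sketch only: synthetic data standing in for the vividness/reaction-time database.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=600, n_features=6, n_informative=5,
                           n_redundant=0, n_classes=5, n_clusters_per_class=1,
                           random_state=0)   # 6 inputs, 5 subjective-time categories

mlp = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(3,),   # three hidden nodes, as in the model
                  max_iter=2000, random_state=0),
)
scores = cross_val_score(mlp, X, y, cv=5, scoring="f1_macro")
print("5-fold macro F1:", scores.round(3), "mean:", round(scores.mean(), 3))
```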
18 pages, 2279 KB  
Article
MvAl-MFP: A Multi-Label Classification Method on the Functions of Peptides with Multi-View Active Learning
by Yuxuan Peng, Jicong Duan, Yuanyuan Dan and Hualong Yu
Curr. Issues Mol. Biol. 2025, 47(8), 628; https://doi.org/10.3390/cimb47080628 - 6 Aug 2025
Viewed by 414
Abstract
The rapid expansion of peptide libraries and the increasing functional diversity of peptides have highlighted the significance of predicting the multifunctional properties of peptides in bioinformatics research. Although supervised learning methods have made advancements, they typically necessitate substantial amounts of labeled data for yielding accurate prediction. This study presents MvAl-MFP, a multi-label active learning approach that incorporates multiple feature views of peptides. This method takes advantage of the natural properties of multi-view representation for amino acid sequences, meets the requirement of the query-by-committee (QBC) active learning paradigm, and further significantly diminishes the requirement for labeled samples while training high-performing models. First, MvAl-MFP generates nine distinct feature views for a few labeled peptide amino acid sequences by considering various peptide characteristics, including amino acid composition, physicochemical properties, evolutionary information, etc. Then, on each independent view, a multi-label classifier is trained based on the labeled samples. Next, a QBC strategy based on the average entropy of predictions across all trained classifiers is adopted to select a specific number of most valuable unlabeled samples to submit them to human experts for labeling by wet-lab experiments. Finally, the aforementioned procedure is iteratively conducted with a constantly expanding labeled set and updating classifiers until it meets the default stopping criterion. The experiments are conducted on a dataset of multifunctional therapeutic peptides annotated with eight functional labels, including anti-bacterial properties, anti-inflammatory properties, anti-cancer properties, etc. The results clearly demonstrate the superiority of the proposed MvAl-MFP method, as it can rapidly improve prediction performance while only labeling a small number of samples. It provides an effective tool for more precise multifunctional peptide prediction while lowering the cost of wet-lab experiments. Full article
(This article belongs to the Special Issue Challenges and Advances in Bioinformatics and Computational Biology)
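A rough sketch of the query-by-committee step, assuming randomly generated feature views and labels: one multi-label classifier per view, with unlabeled peptides ranked by the entropy of the committee's averaged predictions (one common QBC variant) and the most uncertain ones queried for labeling. This is not the released MvAl-MFP code.

```python
# Sketch only: random stand-in feature views and labels; not the MvAl-MFP code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)
n_labeled, n_unlabeled, n_labels = 60, 500, 8
views = [rng.normal(size=(n_labeled + n_unlabeled, d)) for d in (20, 50, 100)]
Y_labeled = rng.integers(0, 2, size=(n_labeled, n_labels))   # multi-label matrix

# Train one multi-label classifier per feature view on the labeled pool.
committee = [
    OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(v[:n_labeled], Y_labeled)
    for v in views
]

# Average the per-label positive probabilities across views, then score each
# unlabeled sample by its mean binary entropy (higher = more uncertainty).
probs = np.mean([clf.predict_proba(v[n_labeled:]) for clf, v in zip(committee, views)],
                axis=0)
eps = 1e-9
entropy = -(probs * np.log(probs + eps)
            + (1 - probs) * np.log(1 - probs + eps)).mean(axis=1)

query_idx = np.argsort(entropy)[-10:]   # the 10 most informative unlabeled peptides
print("indices to send for wet-lab labeling:", np.sort(query_idx))
```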
18 pages, 900 KB  
Article
Don’t Pause Me When I Switch: Parsing Effects of Code-Switching
by Marina Sokolova and Jessica Ward
Languages 2025, 10(8), 183; https://doi.org/10.3390/languages10080183 - 29 Jul 2025
Viewed by 336
Abstract
This study investigates the effect of code-switching (CS) on the processing and attachment resolution of ambiguous relative clauses (RCs) like ‘Bill saw the friend of the neighbor that was talking about football’ by heritage speakers of Spanish. It checks whether code-switching imposes a prosodic break at the place of language change, and whether this prosodic break affects RC parsing, as predicted by the Implicit Prosody Hypothesis: a high attachment (HA) preference results from a prosodic break at the RC. A prosodic break at the preposition ‘of’ in the complex DP ‘the friend of the neighbor’ entails a low attachment (LA) preference. The design compares RC resolution in unilingual sentences (Spanish, with a default preference for HA in RC, and English, with the default LA) with the RC parsing in sentences with CS. The CS occurs at the places of prosodic breaks considered by the IPH. The results show sensitivity to the place of CS in RC attachment. CS prompting LA causes longer response times. The preference for HA in Spanish unilingual sentences is higher than in English ones. Heritage speakers are sensitive to the prosodic effects of CS. However, there is high variability across speakers. Full article
(This article belongs to the Special Issue Language Processing in Spanish Heritage Speakers)
23 pages, 3864 KB  
Article
Seeing Is Craving: Neural Dynamics of Appetitive Processing During Food-Cue Video Watching and Its Impact on Obesity
by Jinfeng Han, Kaixiang Zhuang, Debo Dong, Shaorui Wang, Feng Zhou, Yan Jiang and Hong Chen
Nutrients 2025, 17(15), 2449; https://doi.org/10.3390/nu17152449 - 27 Jul 2025
Viewed by 662
Abstract
Background/Objectives: Digital food-related videos significantly influence cravings, appetite, and weight outcomes; however, the dynamic neural mechanisms underlying appetite fluctuations during naturalistic viewing remain unclear. This study aimed to identify neural activity patterns associated with moment-to-moment appetite changes during naturalistic food-cue video viewing and to examine their relationships with cravings and weight-related outcomes. Methods: Functional magnetic resonance imaging (fMRI) data were collected from 58 healthy female participants as they viewed naturalistic food-cue videos. Participants concurrently provided continuous ratings of their appetite levels throughout video viewing. Hidden Markov Modeling (HMM), combined with machine learning regression techniques, was employed to identify distinct neural states reflecting dynamic appetite fluctuations. Findings were independently validated using a shorter-duration food-cue video viewing task. Results: Distinct neural states characterized by heightened activation in default mode and frontoparietal networks consistently corresponded with increases in appetite ratings. Importantly, the higher expression of these appetite-related neural states correlated positively with participants’ Body Mass Index (BMI) and post-viewing food cravings. Furthermore, these neural states mediated the relationship between BMI and food craving levels. Longitudinal analyses revealed that the expression levels of appetite-related neural states predicted participants’ BMI trajectories over a subsequent six-month period. Participants experiencing BMI increases exhibited a significantly greater expression of these neural states compared to those whose BMI remained stable. Conclusions: Our findings elucidate how digital food cues dynamically modulate neural processes associated with appetite. These neural markers may serve as early indicators of obesity risk, offering valuable insights into the psychological and neurobiological mechanisms linking everyday media exposure to food cravings and weight management. Full article
(This article belongs to the Section Nutrition and Obesity)
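A loose sketch of the modeling idea, not the study's pipeline: fit a Hidden Markov Model to network-level fMRI time courses and test which hidden state's occupancy tracks the continuous appetite ratings. The signals are simulated and hmmlearn is an assumed tool choice.

```python
# Sketch only: simulated signals; hmmlearn is an assumed tool choice.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_timepoints, n_networks = 400, 3    # e.g., DMN, frontoparietal, salience signals
bold = rng.normal(size=(n_timepoints, n_networks))
appetite = rng.normal(size=n_timepoints)   # continuous appetite ratings

hmm = GaussianHMM(n_components=4, covariance_type="full", random_state=0)
hmm.fit(bold)
states = hmm.predict(bold)           # one hidden neural state per timepoint

# Relate each state's occupancy to the moment-to-moment appetite ratings.
for s in range(hmm.n_components):
    occupancy = (states == s).astype(float)
    if occupancy.std() == 0:         # skip states that never occur
        continue
    r, p = pearsonr(occupancy, appetite)
    print(f"state {s}: correlation with appetite r = {r:.2f} (p = {p:.3f})")
```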
32 pages, 7115 KB  
Article
Advancing Knowledge on Machine Learning Algorithms for Predicting Childhood Vaccination Defaulters in Ghana: A Comparative Performance Analysis
by Eliezer Ofori Odei-Lartey, Stephaney Gyaase, Dominic Asamoah, Thomas Gyan, Kwaku Poku Asante and Michael Asante
Appl. Sci. 2025, 15(15), 8198; https://doi.org/10.3390/app15158198 - 23 Jul 2025
Viewed by 513
Abstract
High rates of childhood vaccination defaulting remain a significant barrier to achieving full vaccination coverage in sub-Saharan Africa, contributing to preventable morbidity and mortality. This study evaluated the utility of machine learning algorithms for predicting childhood vaccination defaulters in Ghana, addressing the limitations of traditional statistical methods when handling complex, high-dimensional health data. Using a merged dataset from two malaria vaccine pilot surveys, we engineered novel temporal features, including vaccination timing windows and birth seasonality. Six algorithms, namely logistic regression, support vector machine, random forest, gradient boosting machine, extreme gradient boosting, and artificial neural networks, were compared. Models were trained and validated on both original and synthetically balanced and augmented data. The results showed higher performance across the ensemble tree classifiers. The random forest and extreme gradient boosting models reported the highest F1 scores (0.92) and AUCs (0.95) on augmented unseen data. The key predictors identified include timely receipt of birth and week six vaccines, the child’s age, household wealth index, and maternal education. The findings demonstrate that robust machine learning frameworks, combined with temporal and contextual feature engineering, can improve defaulter risk prediction accuracy. Integrating such models into routine immunization programs could enable data-driven targeting of high-risk groups, supporting policymakers in strategies to close vaccination coverage gaps. Full article
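A minimal sketch of the comparison described above, on simulated data: several classifiers trained on a synthetically balanced training set (SMOTE from imbalanced-learn as one possible balancing choice) and compared by F1 and AUC on an untouched test set. It is not the study's pipeline.

```python
# Sketch only: simulated data; SMOTE is one possible balancing choice.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=3000, n_features=15, weights=[0.85, 0.15],
                           random_state=0)   # roughly 15% defaulters
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Balance the training set only, so the test set stays representative.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_bal, y_bal)
    pred = model.predict(X_te)
    proba = model.predict_proba(X_te)[:, 1]
    print(f"{name}: F1 = {f1_score(y_te, pred):.2f}, "
          f"AUC = {roc_auc_score(y_te, proba):.2f}")
```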