Search Results (1,446)

Search Parameters:
Keywords = biased estimators

15 pages, 753 KB  
Article
Estimating Policy Impact in a Difference-in-Differences Hazard Model: A Simulation Study
by David A. Hsieh
Risks 2025, 13(10), 200; https://doi.org/10.3390/risks13100200 - 13 Oct 2025
Abstract
This article estimates the impact of a policy change on an event probability in a difference-in-differences hazard model using four estimators. We examine the error distributions of the estimators via a simulation experiment with twelve different scenarios. In the four scenarios in which all relevant variables are known, three of the four methods yield accurate estimates of the policy impact. In the eight scenarios in which an individual characteristic is unobservable to the researcher, only one method (nonparametric maximum likelihood) achieves accurate estimates of the policy change. The other three methods (standard Cox, three-step Cox, and linear probability) are severely biased. Full article
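
A minimal sketch of the "standard Cox" variant described above, assuming hypothetical columns 'time', 'event', 'treated', and 'post' and the lifelines library; this illustrates the difference-in-differences hazard setup, not the article's code:

```python
# Toy difference-in-differences hazard model: the policy effect is the
# coefficient on the treated-group x post-period interaction in a Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
})
df["did"] = df["treated"] * df["post"]   # difference-in-differences term
true_beta = 0.5
hazard = 0.1 * np.exp(true_beta * df["did"])
df["time"] = rng.exponential(1.0 / hazard)
df["event"] = 1                          # no censoring in this toy example

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
# Near true_beta here because nothing is unobserved; the abstract's bias
# arises when an individual-level characteristic is omitted from the model.
print(cph.params_["did"])
```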

26 pages, 5244 KB  
Article
Optimizing Spatial Scales for Evaluating High-Resolution CO2 Fossil Fuel Emissions: Multi-Source Data and Machine Learning Approach
by Yujun Fang, Rong Li and Jun Cao
Sustainability 2025, 17(20), 9009; https://doi.org/10.3390/su17209009 (registering DOI) - 11 Oct 2025
Viewed by 51
Abstract
High-resolution CO2 fossil fuel emission data are critical for developing targeted mitigation policies. As a key approach for estimating spatial distributions of CO2 emissions, top–down methods typically rely on spatial proxies to disaggregate administrative-level emissions to finer spatial scales. However, conventional linear regression models may fail to capture complex non-linear relationships between proxies and emissions. Furthermore, methods relying on nighttime light data are largely inadequate at representing emissions in both industrial and rural zones. To address these limitations, this study developed a multiple proxy framework integrating nighttime light, points of interest (POIs), population, road networks, and impervious surface area data. Seven machine learning algorithms—Extra-Trees, Random Forest, XGBoost, CatBoost, Gradient Boosting Decision Trees, LightGBM, and Support Vector Regression—were systematically evaluated for estimating high-resolution CO2 fossil fuel emissions. Comprehensive evaluation revealed that the multiple proxy Extra-Trees model significantly outperformed the single-proxy nighttime light linear regression model at the county scale, achieving R2 = 0.96 (RMSE = 0.52 MtCO2) in cross-validation and R2 = 0.92 (RMSE = 0.54 MtCO2) on the independent test set. Feature importance analysis identified brightness of nighttime light (40.70%) and heavy industrial density (21.11%) as the most critical spatial proxies. The proposed approach also showed strong spatial consistency with the Multi-resolution Emission Inventory for China, exhibiting correlation coefficients of 0.82–0.84. This study demonstrates that integrating local multiple proxy data with machine learning corrects spatial biases inherent in traditional top–down approaches, establishing a transferable framework for high-resolution emissions mapping. Full article
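
A schematic of the multi-proxy setup described above (hypothetical feature names, not the authors' pipeline): several spatial proxies predict county-level emissions with an Extra-Trees regressor, evaluated by cross-validation:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
X = rng.random((n, 5))   # stand-ins: nighttime light, POI density,
                         # population, road density, impervious surface
y = 3 * X[:, 0] + 2 * X[:, 1] ** 2 + rng.normal(0, 0.1, n)  # non-linear target

model = ExtraTreesRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(scores.mean())                     # cross-validated R^2

model.fit(X, y)
print(model.feature_importances_)        # analogous to the paper's ranking
```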

13 pages, 1276 KB  
Article
OGK Approach for Accurate Mean Estimation in the Presence of Outliers
by Atef F. Hashem, Abdulrahman Obaid Alshammari, Usman Shahzad and Soofia Iftikhar
Mathematics 2025, 13(20), 3251; https://doi.org/10.3390/math13203251 (registering DOI) - 11 Oct 2025
Viewed by 98
Abstract
This paper proposes a new family of robust estimators of means, based on the Orthogonalized Gnanadesikan–Kettenring (OGK) covariance matrix. These estimators are computationally feasible and robust replacements for the Minimum Covariance Determinant (MCD) estimator in survey sampling contexts involving auxiliary information. Given the growing prevalence of outliers in environmental data, as in the case of measuring solar radiation, conventional estimators like the sample mean or the Ordinary Least Squares (OLS) regression-based estimators are both biased and unreliable. The suggested OGK-based exponential-type estimators combine robust measures of location and dispersion and have a considerable advantage in the estimation of the population mean when auxiliary variables such as temperature are highly correlated with the variable of interest. The MSE of the OGK-based estimators is derived theoretically, together with expressions for the optimal weights. Performance was further demonstrated using real-world and simulated solar radiation data, where the proposed estimators achieved lower MSEs and higher PREs than MCD-based estimators. These results show that OGK-based estimators are highly efficient and robust in actual and artificially contaminated situations and hence are a good option in robust survey sampling and environmental data analysis. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)
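
For reference, the classical exponential ratio-type estimator that this family generalizes (our notation; the OGK-based versions substitute robust location and dispersion measures for the sample moments and carry optimal weights) has the form

$$\bar{y}_{\mathrm{exp}} = \bar{y}\,\exp\!\left(\frac{\bar{X}-\bar{x}}{\bar{X}+\bar{x}}\right),$$

where $\bar{y}$ and $\bar{x}$ are the sample means of the study and auxiliary variables and $\bar{X}$ is the known population mean of the auxiliary variable.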

22 pages, 782 KB  
Review
Deep Mutational Scanning in Immunology: Techniques and Applications
by Chengwei Shao, Siyue Jia, Yue Li and Jingxin Li
Pathogens 2025, 14(10), 1027; https://doi.org/10.3390/pathogens14101027 - 10 Oct 2025
Viewed by 197
Abstract
Mutations may cause changes in the structure and function of immune-related proteins, thereby affecting the operation of the immune system. Deep mutational scanning combines saturation mutagenesis, functional selection, and high-throughput sequencing to evaluate the effects of mutations on a large scale and with high resolution. By systematically and comprehensively analyzing the impact of mutations on the functions of immune-related proteins, the immune response mechanism can be better understood. However, each stage of deep mutational scanning has its limits, and the approach remains constrained in several ways. These include data and selection biases that affect the robustness of effect estimates, insufficient library coverage and editability leading to uneven representation of sites and alleles, system-induced biased signals that shift phenotypes away from their true physiological state, and imperfect models and statistical processing that limit extrapolation capabilities. Therefore, this technology still needs further development. Herein, we summarize the principles and methods of deep mutational scanning and discuss its application in immunological research. The aim is to provide insights into the broader application prospects of deep mutational scanning technology in immunology. Full article
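
A common deep-mutational-scanning summary statistic, shown here generically (not tied to this review): the log2 enrichment of each variant after functional selection, computed from pre- and post-selection read counts:

```python
import numpy as np

pre_counts = np.array([1200, 850, 40, 300])   # reads per variant before selection
post_counts = np.array([900, 1400, 5, 310])   # reads per variant after selection
pseudo = 0.5                                  # pseudocount for zero-read variants

pre_freq = (pre_counts + pseudo) / (pre_counts + pseudo).sum()
post_freq = (post_counts + pseudo) / (post_counts + pseudo).sum()
enrichment = np.log2(post_freq / pre_freq)    # >0: variant favored by selection
print(enrichment)
```

The data and selection biases flagged in the abstract act directly on counts like these, which is why the robustness of such effect estimates is a recurring concern.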

30 pages, 515 KB  
Article
Symmetric Positive Semi-Definite Fourier Estimator of Spot Covariance Matrix with High Frequency Data
by Jiro Akahori, Reika Kambara, Nien-Lin Liu, Maria Elvira Mancino, Tommaso Mariotti and Yukie Yasuda
Risks 2025, 13(10), 197; https://doi.org/10.3390/risks13100197 - 9 Oct 2025
Viewed by 165
Abstract
This paper proposes a nonparametric estimator of the spot volatility matrix with high-frequency data. Our newly proposed Positive Definite Fourier (PDF) estimator produces symmetric positive semi-definite estimates and is consistent with a suitable choice of the localizing kernel. The PDF estimator is based on a modification of the Fourier estimation method introduced by Malliavin and Mancino. The estimator has two parameters: the frequency N, which controls the biases due to the asynchronicity effect and the market microstructure noise effect; and the localization parameter M for the employed Gaussian kernel. The sensitivity of the PDF estimator to the choice of these two parameters is studied in a simulated environment. The accuracy and the ability of the estimator to produce positive semi-definite covariance matrices are evaluated by an extensive numerical analysis against competing estimators from the literature. The results of the simulations are confirmed under different scenarios, including the dimensionality of the problem, the asynchronicity of data, and several different specifications of the market microstructure noise. The computational time required by the estimator and the stability of estimation are also tested with empirical data. Full article
(This article belongs to the Special Issue Integrating New Risks into Traditional Risk Management)
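
For orientation, the Fourier method of Malliavin and Mancino that the PDF estimator modifies recovers the Fourier coefficients of the volatility by Bohr convolution of the log-price increment coefficients and then localizes in time; schematically (normalization conventions vary across the literature),

$$c_k(\sigma^2) \approx \frac{2\pi}{2N+1}\sum_{|s|\le N} c_s(dp)\,c_{k-s}(dp), \qquad \hat{\sigma}^2(t) = \sum_{|k|\le M} K\!\left(\frac{k}{M}\right) c_k(\sigma^2)\,e^{ikt},$$

where N is the cutting frequency that governs the asynchronicity and microstructure biases, and K is the localizing kernel (classically a Fejér kernel; in the PDF estimator a Gaussian kernel with parameter M plays this role).
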
41 pages, 4705 KB  
Article
Full-Cycle Evaluation of Multi-Source Precipitation Products for Hydrological Applications in the Magat River Basin, Philippines
by Jerome G. Gacu, Sameh Ahmed Kantoush and Binh Quang Nguyen
Remote Sens. 2025, 17(19), 3375; https://doi.org/10.3390/rs17193375 - 7 Oct 2025
Viewed by 252
Abstract
Satellite Precipitation Products (SPPs) play a crucial role in hydrological modeling, particularly in data-scarce and climate-sensitive basins such as the Magat River Basin (MRB), Philippines—one of Southeast Asia’s most typhoon-prone and infrastructure-critical watersheds. This study presents the first full-cycle evaluation of nine widely used multi-source precipitation products (2000–2024), integrating raw validation against rain gauge observations, bias correction using quantile mapping, and post-correction re-ranking through an Entropy Weight Method–TOPSIS multi-criteria decision analysis (MCDA). Before correction, SM2RAIN-ASCAT demonstrated the strongest statistical performance, while CHIRPS and ClimGridPh-RR exhibited robust detection skills and spatial consistency. Following bias correction, substantial improvements were observed across all products, with CHIRPS markedly reducing systematic errors and ClimGridPh-RR showing enhanced correlation and volume reliability. Biases were decreased significantly, highlighting the effectiveness of quantile mapping in improving both seasonal and annual precipitation estimates. Beyond conventional validation, this framework explicitly aligns SPP evaluation with four critical hydrological applications: flood detection, drought monitoring, sediment yield modeling, and water balance estimation. The analysis revealed that SM2RAIN-ASCAT is most suitable for monitoring seasonal drought and dry periods, CHIRPS excels in detecting high-intensity and erosive rainfall events, and ClimGridPh-RR offers the most consistent long-term volume-based estimates. By integrating validation, correction, and application-specific ranking, this study provides a replicable blueprint for operational SPP assessment in monsoon-dominated, data-limited basins. The findings underscore the importance of tailoring product selection to hydrological purposes, supporting improved flood early warning, drought preparedness, sediment management, and water resources governance under intensifying climatic extremes. Full article
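
Empirical quantile mapping in its simplest form, as an illustration of the bias-correction idea (not the study's exact implementation): satellite values are mapped onto the gauge climatology by matching quantiles:

```python
import numpy as np

rng = np.random.default_rng(2)
gauge = rng.gamma(2.0, 5.0, 3000)       # "observed" rainfall climatology
satellite = rng.gamma(2.0, 7.0, 3000)   # biased product over the same period

q = np.linspace(0, 1, 101)
sat_q = np.quantile(satellite, q)       # product quantiles (increasing)
gauge_q = np.quantile(gauge, q)         # gauge quantiles

new_sat = rng.gamma(2.0, 7.0, 10)       # fresh product values to correct
corrected = np.interp(new_sat, sat_q, gauge_q)   # quantile-matched values
print(corrected)
```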

32 pages, 4143 KB  
Article
Aspects of Biology and Machine Learning for Age Prediction in the Large-Eye Dentex Dentex macrophthalmus (Bloch, 1791)
by Dimitris Klaoudatos, Alexandros Theocharis, Chrysoula Vardaki, Elpida Pachi, Dimitris Politikos and Alexis Conides
Fishes 2025, 10(10), 500; https://doi.org/10.3390/fishes10100500 - 6 Oct 2025
Viewed by 338
Abstract
The large-eye dentex (Dentex macrophthalmus) is a relatively small sparid fish with increasing potential as a supplementary fishery resource in the Mediterranean Sea, particularly as traditional stocks face overexploitation. Despite its widespread distribution, biological data on this species, especially from Greek waters, remain scarce. This study presents the first comprehensive biological assessment of D. macrophthalmus in the Pagasitikos Gulf, focusing on population structure, growth, mortality, and the application of machine learning (ML) for age prediction. A total of 305 individuals were collected, revealing a female-biased sex ratio and negative allometric growth in both somatic and otolith dimensions. The von Bertalanffy growth parameters indicated a slow growth rate (k = 0.16 year−1), with an estimated asymptotic length (L∞) of 25.97 cm. The population was found to be underexploited (E = 0.41), suggesting resilience to current fishing pressure. Stepwise regression and ML models were employed to predict age from otolith morphometrics. A linear model identified otolith weight and aspect ratio as the most significant predictors of age (R2 = 0.8). Among the ML algorithms tested, the Neural Network model achieved the highest performance (R2 = 0.764, MAPE = 14.10%), demonstrating its potential for accurate and efficient age estimation. These findings provide crucial baseline data for the sustainable management of D. macrophthalmus and highlight the value of integrating advanced ML techniques into fisheries biology. Full article
(This article belongs to the Section Biology and Ecology)
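
The growth parameters quoted above enter the von Bertalanffy growth function,

$$L(t) = L_\infty\left(1 - e^{-k\,(t - t_0)}\right),$$

so with the reported k = 0.16 year−1 and L∞ = 25.97 cm, predicted length approaches its asymptote slowly with age (t0, the theoretical age at zero length, is not quoted in the abstract).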

30 pages, 1778 KB  
Article
AI, Ethics, and Cognitive Bias: An LLM-Based Synthetic Simulation for Education and Research
by Ana Luize Bertoncini, Raul Matsushita and Sergio Da Silva
AI Educ. 2026, 1(1), 3; https://doi.org/10.3390/aieduc1010003 - 4 Oct 2025
Viewed by 587
Abstract
This study examines how cognitive biases may shape ethical decision-making in AI-mediated environments, particularly within education and research. As AI tools increasingly influence human judgment, biases such as normalization, complacency, rationalization, and authority bias can lead to ethical lapses, including academic misconduct, uncritical reliance on AI-generated content, and acceptance of misinformation. To explore these dynamics, we developed an LLM-generated synthetic behavior estimation framework that modeled six decision-making scenarios with probabilistic representations of key cognitive biases. The scenarios addressed issues ranging from loss of human agency to biased evaluations and homogenization of thought. Statistical summaries of the synthetic dataset indicated that 71% of agents engaged in unethical behavior influenced by biases like normalization and complacency, 78% relied on AI outputs without scrutiny due to automation and authority biases, and misinformation was accepted in 65% of cases, largely driven by projection and authority biases. These statistics are descriptive of this synthetic dataset only and are not intended as inferential claims about real-world populations. The findings nevertheless suggest the potential value of targeted interventions—such as AI literacy programs, systematic bias audits, and equitable access to AI tools—to promote responsible AI use. As a proof-of-concept, the framework offers controlled exploratory insights, but all reported outcomes reflect text-based pattern generation by an LLM rather than observed human behavior. Future research should validate and extend these findings with longitudinal and field data. Full article
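
A toy version of a probabilistic bias simulation of this kind (purely illustrative; the study's scenarios and probabilities are richer): each synthetic agent carries bias strengths, and unethical behavior is a Bernoulli draw whose probability grows with normalization and complacency:

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents = 10_000
normalization = rng.random(n_agents)   # bias strengths in [0, 1]
complacency = rng.random(n_agents)

p_unethical = 0.2 + 0.35 * normalization + 0.35 * complacency  # toy model
acted_unethically = rng.random(n_agents) < p_unethical
print(acted_unethically.mean())        # descriptive rate for this dataset only
```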

12 pages, 423 KB  
Article
The Criterion Validity of a Newly Developed Ballroom Aerobic Test (BAT) Protocol Against Objective Methods
by Tamara Despot and Davor Plavec
Sports 2025, 13(10), 337; https://doi.org/10.3390/sports13100337 - 1 Oct 2025
Viewed by 158
Abstract
Although laboratory testing to assess aerobic capacity has been a ‘gold standard’ in sports science, its high costs and time-consuming protocols may not be feasible for monitoring and tracking progress in limited conditions. In dancesport athletes, several field-based aerobic tests have been proposed, but the majority of them have been developed for ballet or contemporary dancers at the individual level, while data among dance couples engaging in standard dance styles are lacking. Therefore, the main purpose of this study was to validate a newly developed Ballroom Aerobic Test (BAT) protocol against objective methods. Twelve standard dancesport couples (age: 20.4 ± 3.9 years; height: 172.1 ± 8.7 cm; weight: 60.1 ± 9.4 kg) with 8.2 ± 3.4 years of training and competing experience participated in this study. Ventilatory and metabolic parameters were generated using the MetaMax® 3B portable gas analyzer (the BAT), while the KF1 (speed increased by 0.5 km·h−1 every minute) and Bruce protocols were followed in laboratory-based settings on the running ergometer. Large to very large correlations were obtained between the BAT and KF1/Bruce protocols for the absolute maximal oxygen uptake (VO2max; r = 0.88 and 0.87) and relative VO2max (r = 0.88 and 0.85), respiratory exchange ratio (RER; r = 0.78 and 0.76), expiratory ventilation (VE; r = 0.86 and 0.79), tidal volume (VT; r = 0.75; 95% CI = 0.57–0.87; p < 0.001), ventilatory equivalent for O2 (VE/VO2; r = 0.81 and 0.80) and CO2 (VE/VCO2; r = 0.78 and 0.82), and dead space (VD/VT; r = 0.70 and 0.74). The Bland–Altman plots indicated no systematic or proportional biases between the BAT and KF1 protocols (standard error of estimate; SEE = ± 3.36 mL·kg−1·min−1) or the BAT and Bruce protocols (SEE = ± 3.75 mL·kg−1·min−1). This study shows that the BAT exhibits satisfactory agreement properties against objective methods and is a valid dance protocol to accurately estimate aerobic capacity in dancesport athletes participating in standard dance styles. Full article
(This article belongs to the Special Issue Sport-Specific Testing and Training Methods in Youth)
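
The Bland–Altman quantities used above, in generic form (illustration only, with made-up values): the mean bias and 95% limits of agreement between two VO2max methods:

```python
import numpy as np

bat = np.array([52.1, 48.3, 55.0, 50.2, 47.8])   # hypothetical mL·kg−1·min−1
lab = np.array([51.0, 49.5, 54.1, 51.3, 48.6])

diff = bat - lab
bias = diff.mean()                               # systematic bias
loa = 1.96 * diff.std(ddof=1)                    # limits-of-agreement half-width
print(bias, bias - loa, bias + loa)
```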

25 pages, 2465 KB  
Article
On the Spatial Distribution of Eagle Carcasses Around Wind Turbines: Implications for Collision Mortality Estimation
by K. Shawn Smallwood and Douglas A. Bell
Diversity 2025, 17(10), 686; https://doi.org/10.3390/d17100686 - 30 Sep 2025
Viewed by 259
Abstract
With worldwide development of wind energy, birds have grown increasingly vulnerable to collisions with wind turbines. For several species of eagles, which in many countries are accorded special protection due to a host of anthropogenic threats, accurate estimates of collision mortality are needed to assess impacts and to formulate appropriate mitigation strategies. Unfortunately, estimates of wind turbine collision mortality are often biased low by failing to account for carcasses that fall beyond the fatality search area boundary, B. In some instances, carcass density is modeled across the fatality search area to adjust for these undetected fatalities. Yet for more accurate fatality estimates, it is important to determine B̂, the search area boundary within which all carcasses could be found. We used eagle carcass data from multi-year fatality studies conducted at the Island of Smøla, Norway, and the Altamont Pass Wind Resource Area, California, USA, to assess carcass density (i) as a contributor to mortality estimation, (ii) as a predictor variable of B, and (iii) to test whether the cumulative carcass counts with increasing distance from the wind turbine can predict B̂. We found that carcass counts within 5 m annuli change little with increasing distance from modern wind turbines, and that carcass density is largely a function of the area calculated. Characterization of the spatial distribution of carcasses within the search area varies with the search radius that determines B. However, this may not represent the true spatial distribution of carcasses, including those found beyond B. We assert that the available data are unsuitable for predicting the number of eagle carcasses within and beyond a given search area, or for determining B̂, but they do indicate that B̂ lies much farther from wind turbines than previously assumed. Ultimately, modeling available carcass distribution data cannot replace the need for searching farther from wind turbines to account for the true number of eagle collision victims at any given wind project. Full article
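
Carcass density per 5 m annulus, shown generically (an illustration of the quantity discussed above): the same count spread over a ring whose area grows with distance yields a sharply falling density even when counts are flat:

```python
import numpy as np

edges = np.arange(0, 105, 5)                        # 5 m annuli out to 100 m
counts = np.full(len(edges) - 1, 3)                 # flat counts per annulus
areas = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)  # ring areas in m^2
density = counts / areas                            # carcasses per m^2
print(density[:3], density[-3:])                    # density falls roughly as 1/r
```

This is the arithmetic behind the abstract's observation that carcass density is largely a function of the area calculated.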

12 pages, 1328 KB  
Article
Long-Term Variations in Background Bias and Magnetic Field Noise in HSOS/SMFT Observations
by Haiqing Xu, Hongqi Zhang, Suo Liu, Jiangtao Su, Yuanyong Deng, Shangbin Yang, Mei Zhang and Jiaben Lin
Universe 2025, 11(10), 328; https://doi.org/10.3390/universe11100328 - 28 Sep 2025
Viewed by 173
Abstract
The Solar Magnetic Field Telescope (SMFT) at Huairou Solar Observing Station (HSOS) has conducted continuous observations of solar vector magnetic fields for nearly four decades, and while the primary optical system remains unchanged, critical components—including filters, polarizers, and detectors—have undergone multiple upgrades and replacements. Maintaining data consistency is essential for reliable long-term studies of magnetic field evolution and solar activity, as well as current helicity. In this study, we systematically analyze background bias and noise levels in SMFT observations from 1988 to 2019. Our dataset comprises 12,281 vector magnetograms of 1484 active regions. To quantify background bias, we computed mean values of Stokes Q/I, U/I and V/I over each entire magnetogram. The background bias of Stokes V/I is small for the whole dataset. The background biases of Stokes Q/I and U/I fluctuate around zero during 1988–2000. From 2001 to 2011, however, the fluctuations in the background bias of both Q/I and U/I become significantly larger, exhibiting mixed positive and negative values. Between 2012 and 2019, the background biases shift to predominantly positive values for both Stokes Q/I and U/I parameters. To address this issue, we propose a potential method for removing the background bias and further discuss its impact on the estimation of current helicity. For each magnetogram, we quantify measurement noise by calculating the standard deviation (σ) of the longitudinal (Bl) and transverse (Bt) magnetic field components within a quiet-Sun region. The noise levels for Bl and Bt components were approximately 15 Gauss (G) and 87 G, respectively, during 1988–2011. Since 2012, these values decreased significantly to ∼6 G for Bl and ∼55 G for Bt, likely due to the installation of a new filter. Full article
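
The two diagnostics described above, in schematic numpy form (hypothetical array names): the background bias is the mean of a Stokes image over the full magnetogram, and the noise level is the standard deviation of a field component inside a quiet-Sun patch:

```python
import numpy as np

rng = np.random.default_rng(4)
stokes_q = rng.normal(1e-4, 1e-3, (512, 512))  # stand-in Q/I image
b_l = rng.normal(0.0, 15.0, (512, 512))        # stand-in longitudinal field (G)

background_bias = stokes_q.mean()              # mean over the whole magnetogram
quiet = b_l[200:260, 200:260]                  # chosen quiet-Sun window
noise_sigma = quiet.std()                      # ~15 G, matching the 1988-2011 era
print(background_bias, noise_sigma)
```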

22 pages, 12368 KB  
Article
Implementing an Indirect Radar Assimilation Scheme with a 1D Bayesian Retrieval in the Numerical Prediction Model
by Jian Yin, Xiang-Yu Huang, Bing Lu, Min Chen, Yao Sun, Yijie Zhu and Cheng Wang
Remote Sens. 2025, 17(19), 3320; https://doi.org/10.3390/rs17193320 - 27 Sep 2025
Viewed by 271
Abstract
To enhance the operational efficiency of the CMA-BJ3.0 regional numerical model and address the issue of short-term precipitation overforecasting caused by assimilating estimated saturated water vapor, this study investigates the assimilation of radar reflectivity mosaic data by optimizing the configuration of retrieved water vapor in the indirect assimilation scheme. A 1D (one-dimensional) Bayesian method was employed to retrieve and constrain water vapor from reflectivity observations, generating retrieved water vapor for assimilation to mitigate overforecasting biases. A case study of precipitation on 1 August 2022 was analyzed, with particular focus on comparing the innovation vector statistics, spatial patterns of analysis increments, and physical mechanisms underlying forecast differences across multiple data assimilation configurations. Results showed that an observation-background (O-B) statistical distribution closer to a Gaussian unbiased state indicated a better balance between observations and the background field. The optimized scheme corrected systematic positive biases in water vapor, curbed excessive increments, and effectively resolved the overforecasting issue by refining the initial water vapor field. Batch experiments quantitatively demonstrated that assimilating 1D Bayesian-retrieved water vapor significantly improved precipitation forecast scores, particularly for higher magnitudes (≥25.0 mm/3 h), and reduced the over-forecast within the first 6 h. While the study focused on improving short-term precipitation accuracy without considering hydrometeor impacts or convective dynamics, the 1D Bayesian method, despite its background-dependency, proved effective in correcting water vapor biases, making it a promising assimilation scheme. Full article
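
In a 1D Bayesian retrieval of this kind, the retrieved quantity (here a water vapor column) is typically a posterior-weighted average of candidate columns drawn from the model background, which is also the source of the background-dependency noted above; schematically (our notation, assuming Gaussian observation error with covariance R),

$$\hat{x} = \frac{\sum_i w_i\, x_i}{\sum_i w_i}, \qquad w_i = \exp\!\left(-\frac{1}{2}\left(y - H(x_i)\right)^{\mathsf{T}} R^{-1}\left(y - H(x_i)\right)\right),$$

where y is the observed reflectivity, H the observation operator, and x_i the candidate columns.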

28 pages, 1485 KB  
Article
Cautious Optimism Building: What HIE Managers Think About Adding Artificial Intelligence to Improve Patient Matching
by Thomas R. Licciardello, David Gefen and Rajiv Nag
Soc. Sci. 2025, 14(10), 579; https://doi.org/10.3390/socsci14100579 - 26 Sep 2025
Viewed by 432
Abstract
Each year an estimated 440,000 medical errors occur in the U.S., of which 38% are a direct result of patient matching errors. As patients seek care in medical facilities, their records are often dispersed. Health Information Exchanges (HIEs) strive to retrieve and consolidate these records and as such, accurate matching of patient data becomes a critical prerequisite. Artificial intelligence (AI) is increasingly being seen as a potential solution to this vexing challenge. We present findings from an exploratory field study involving interviews with 27 HIE executives across the U.S. on tensions they are sensing and balancing in incorporating AI in patient matching processes. Our analysis of data from the interviews reveals, on the one hand, significant optimism regarding AI’s capacity to improve matching processes, and on the other, concerns due to the risks associated with algorithmic biases, uncertainties regarding AI-based decision-making, and implementation hurdles such as costs, the need for specialized talent, and insufficient datasets for training AI models. We conceptualize this dialectical tension in the form of a grounded theory framework on Cautious AI Optimism. Full article
(This article belongs to the Special Issue Technology, Digital Media and Politics)

18 pages, 2325 KB  
Article
Sampling-Based Adaptive Techniques for Reducing Non-Gaussian Position Errors in GNSS/INS Systems
by Yong Hun Kim, Joo Han Lee, Kyeong Wook Seo, Min Ho Lee and Jin Woo Song
Aerospace 2025, 12(10), 863; https://doi.org/10.3390/aerospace12100863 - 24 Sep 2025
Viewed by 249
Abstract
In this paper, we propose a novel method to reduce non-Gaussian errors in measurements using sampling-based distribution estimation. Although non-Gaussian errors are often treated as statistical deviations, they can frequently arise in practical unmanned aerial systems that depend on global navigation satellite systems (GNSS), where position measurements are degraded by multipath effects. However, nonlinear or robust filters have shown limited effectiveness in correcting such errors, particularly when they appear as persistent biases in measurements over time. In such cases, adaptive techniques have often demonstrated greater effectiveness. The proposed method estimates the distribution of observed measurements using a sampling-based approach and derives a reformed measurement from this distribution. By incorporating this reformed measurement into the filter update, the proposed approach achieves lower error levels than traditional adaptive filters. To validate the effectiveness of the method, Kalman filter simulations are conducted for drone GNSS/INS navigation. The results show that the proposed method outperforms conventional non-Gaussian filters in handling measurement bias caused by non-Gaussian errors. Furthermore, it achieves nearly twice the estimation accuracy compared to adaptive approaches. These findings confirm the robustness of the proposed technique in scenarios where measurement accuracy temporarily deteriorates before recovering. Full article
(This article belongs to the Section Astronautics & Space Science)
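
One way to realize the "reformed measurement" idea sketched above (an illustration under our own assumptions, not the authors' algorithm): estimate the recent measurement distribution from a sliding window of samples and feed a robust statistic of it to the filter update instead of the raw value:

```python
import numpy as np

rng = np.random.default_rng(5)
truth = 100.0
window = truth + rng.normal(0, 1, 50)   # sliding window of position measurements
window[-15:] += 8.0                     # multipath-like persistent bias burst

reformed = np.median(window)            # robust summary of the estimated distribution
print(window[-1], reformed)             # raw biased sample vs. reformed measurement
```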

31 pages, 2653 KB  
Article
A Machine Learning and Econometric Framework for Credibility-Aware AI Adoption Measurement and Macroeconomic Impact Assessment in the Energy Sector
by Adriana AnaMaria Davidescu, Marina-Diana Agafiței, Mihai Gheorghe and Vasile Alecsandru Strat
Mathematics 2025, 13(19), 3075; https://doi.org/10.3390/math13193075 - 24 Sep 2025
Viewed by 430
Abstract
Artificial intelligence (AI) adoption in strategic sectors such as energy is often framed in optimistic narratives, yet its actual economic contribution remains under-quantified. This study proposes a novel, multi-stage methodology at the intersection of machine learning, statistics, and big data analytics to bridge this gap. First, we construct a media-derived AI Adoption Score using natural language processing (NLP) techniques, including dictionary-based keyword extraction, sentiment analysis, and zero-shot classification, applied to a large corpus of firm-related news and scientific publications. To enhance reliability, we introduce a Misinformation Bias Score (MBS)—developed via zero-shot classification and named entity recognition—to penalise speculative or biased reporting, yielding a credibility-adjusted adoption metric. Using these scores, we classify firms and apply a Fixed Effects Difference-in-Differences (FE DiD) econometric model to estimate the causal effect of AI adoption on turnover. Finally, we scale firm-level results to the macroeconomic level via a Leontief Input–Output model, quantifying direct, indirect, and induced contributions to GDP and employment. Results show that AI adoption in Romania’s energy sector accounts for up to 42.8% of adopter turnover, contributing 3.54% to national GDP in 2023 and yielding a net employment gain of over 65,000 jobs, despite direct labour displacement. By integrating machine learning-based text analytics, statistical causal inference, and big data-driven macroeconomic modelling, this study delivers a replicable framework for measuring credible AI adoption and its economy-wide impacts, offering valuable insights for policymakers and researchers in digital transformation, energy economics, and sustainable development. Full article
(This article belongs to the Special Issue Machine Learning, Statistics and Big Data, 2nd Edition)
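
The Leontief step used to scale firm-level results to the macroeconomy follows the standard input–output identity,

$$x = (I - A)^{-1} f,$$

where A is the matrix of technical coefficients, f is the final-demand vector (here the estimated AI-attributable turnover shock), and the Leontief inverse (I − A)−1 propagates the shock through inter-industry linkages to give the direct and indirect effects; induced effects come from a household-endogenized (Type II) variant of the same identity.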
