Search Results (105)

Search Parameters:
Keywords = Bayes theorem

11 pages, 349 KB  
Article
Sepsis Prediction: Biomarkers Combined in a Bayesian Approach
by João V. B. Cabral, Maria M. B. M. da Silveira, Wilma T. F. Vasconcelos, Amanda T. Xavier, Fábio H. P. C. de Oliveira, Thaysa M. G. A. L. de Menezes, Keylla T. F. Barbosa, Thaisa R. Figueiredo, Jabiael C. da Silva Filho, Tamara Silva, Leuridan C. Torres, Dário C. Sobral Filho and Dinaldo C. de Oliveira
Int. J. Mol. Sci. 2025, 26(15), 7379; https://doi.org/10.3390/ijms26157379 - 30 Jul 2025
Viewed by 734
Abstract
Sepsis is a serious public health problem. sTREM-1 is a marker of inflammatory and infectious processes that has the potential to become a useful tool for predicting the evolution of sepsis. A prediction model for sepsis was constructed by combining sTREM-1, CRP, and a leukogram via a Bayesian network. A translational study was carried out with 32 children with congenital heart disease who had undergone surgical correction at a public referral hospital in Northeast Brazil. In the postoperative period, the mean value of sTREM-1 was greater among patients diagnosed with sepsis than among those not diagnosed with sepsis (394.58 pg/mL versus 239.93 pg/mL, p < 0.001). Analysis of the ROC curve for sTREM-1 and sepsis revealed that the area under the curve was 0.761, with a 95% CI (0.587–0.935) and p = 0.013. With the Bayesian model, we found that a 100% probability of sepsis was related to postoperative blood concentrations of CRP above 71 mg/dL, a leukogram above 14,000 cells/μL, and sTREM-1 concentrations above the cutoff point (283.53 pg/mL). The proposed model using the Bayesian network approach with the combination of CRP, leukocyte count, and postoperative sTREM-1 showed promise for the diagnosis of sepsis.
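
The combination rule can be illustrated with a toy naive-Bayes version of such a model. The conditional probabilities and prior below are hypothetical placeholders, not the fitted values of the paper's Bayesian network:

```python
# Illustrative naive-Bayes combination of the three markers described in the
# abstract (sTREM-1, CRP, leukogram). All probabilities are HYPOTHETICAL
# placeholders, not the fitted values from the paper.

def posterior_sepsis(strem1_high, crp_high, wbc_high, prior=0.3):
    # P(marker elevated | sepsis) and P(marker elevated | no sepsis) -- assumed
    p_given_sepsis = {"strem1": 0.85, "crp": 0.80, "wbc": 0.75}
    p_given_healthy = {"strem1": 0.20, "crp": 0.30, "wbc": 0.35}
    flags = {"strem1": strem1_high, "crp": crp_high, "wbc": wbc_high}
    like_s, like_h = prior, 1.0 - prior
    for k, elevated in flags.items():
        like_s *= p_given_sepsis[k] if elevated else 1 - p_given_sepsis[k]
        like_h *= p_given_healthy[k] if elevated else 1 - p_given_healthy[k]
    return like_s / (like_s + like_h)   # Bayes' theorem

# All three markers above their cutoffs (e.g., sTREM-1 > 283.53 pg/mL):
print(f"P(sepsis | all elevated) = {posterior_sepsis(True, True, True):.3f}")
```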

22 pages, 1718 KB  
Review
A Review on Risk and Reliability Analysis in Photovoltaic Power Generation
by Ahmad Zaki Abdul Karim, Mohamad Shaiful Osman and Mohd. Khairil Rahmat
Energies 2025, 18(14), 3790; https://doi.org/10.3390/en18143790 - 17 Jul 2025
Viewed by 535
Abstract
Precise evaluation of risk and reliability is crucial for decision making and for predicting the outcome of investment in a photovoltaic power system (PVPS) due to its intermittent source. This paper explores different methodologies for risk evaluation and reliability assessment, which can be categorized into qualitative, quantitative, and hybrid qualitative and quantitative (HQQ) approaches. Qualitative methods include failure mode analysis, graphical analysis, and hazard analysis, while quantitative methods include analytical methods, stochastic methods, Bayes' theorem, reliability optimization, multi-criteria analysis, and data utilization. HQQ methodology combines table-based and visual analysis methods. Currently, reliability assessment techniques such as mean time between failures (MTBF), system average interruption frequency index (SAIFI), and system average interruption duration index (SAIDI) are commonly used to predict PVPS performance. However, economic metrics such as the levelized cost of energy (LCOE) and net present value (NPV) can also be used. Therefore, risk and reliability approaches should be applied together to improve the accuracy of predicting significant aspects of the photovoltaic industry.
(This article belongs to the Section B: Energy and Environment)
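
Two of the metrics named in the review are straightforward to compute; the sketch below uses invented example values, not data from the paper:

```python
# Toy illustrations of two metrics named in the review. Inputs are invented
# example values, not results from the paper.

def mtbf(operating_hours: float, n_failures: int) -> float:
    """Mean time between failures (hours)."""
    return operating_hours / n_failures

def lcoe(capex, annual_opex, annual_energy_kwh, rate, years):
    """Levelized cost of energy: discounted lifetime cost / discounted energy."""
    cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
    energy = sum(annual_energy_kwh / (1 + rate) ** t for t in range(1, years + 1))
    return cost / energy

print(f"MTBF = {mtbf(8760 * 5, 3):,.0f} h")                      # 5 years, 3 failures
print(f"LCOE = {lcoe(1.2e6, 15e3, 1.6e6, 0.05, 25):.4f} $/kWh")  # 1 MW-class plant
```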

15 pages, 3145 KB  
Article
Probabilistic Prediction of Spudcan Bearing Capacity in Stiff-over-Soft Clay Based on Bayes’ Theorem
by Zhaoyu Sun, Pan Gao, Yanling Gao, Jianze Bi and Qiang Gao
J. Mar. Sci. Eng. 2025, 13(7), 1344; https://doi.org/10.3390/jmse13071344 - 14 Jul 2025
Viewed by 332
Abstract
During offshore operations of jack-up platforms, the spudcan may experience sudden punch-through failure when penetrating from an overlying stiff clay layer into the underlying soft clay, posing significant risks to platform safety. Conventional punch-through prediction methods, which rely on predetermined soil parameters, exhibit limited accuracy because they fail to account for uncertainties in seabed stratigraphy and soil properties. To address this limitation, a probabilistic prediction framework for the peak resistance and the corresponding depth is developed from a database of centrifuge model tests by integrating empirical prediction formulas via Bayes' theorem. The proposed Bayesian methodology effectively refines prediction accuracy by quantifying uncertainties in soil parameters, spudcan geometry, and computational models. Specifically, it establishes prior probability distributions of peak resistance and depth through Monte Carlo simulations and then updates these distributions in real time using field monitoring data during spudcan penetration. The results demonstrate that both the method recommended in ISO 19905-1 and an existing deterministic model tend to yield conservative estimates. The proposed approach significantly improves the prediction accuracy of the peak resistance compared with deterministic methods. Additionally, the most probable failure zone converges toward the actual punch-through point as more monitoring data are incorporated. The enhanced prediction capability provides critical decision support for mitigating punch-through risk during offshore jack-up operations, thereby advancing the safety and reliability of marine engineering practice.
(This article belongs to the Section Ocean Engineering)
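
A minimal sketch of this updating scheme under assumed numbers: a Monte Carlo prior over the peak resistance is reweighted by a Gaussian likelihood as monitoring data arrive. The empirical formula, noise levels, and the monitoring point are placeholders, not the paper's calibrated models:

```python
# Monte Carlo prior over peak punch-through resistance, reweighted by Bayes'
# theorem as monitoring data arrive. Formula and noise levels are assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
su_stiff = rng.normal(60.0, 10.0, n)     # assumed prior: stiff-clay strength (kPa)
model_err = rng.normal(1.0, 0.15, n)     # assumed multiplicative model uncertainty
q_peak = 6.0 * su_stiff * model_err      # placeholder empirical formula (kPa)

weights = np.ones(n)
for q_obs in [340.0]:                    # hypothetical monitoring observation (kPa)
    sigma_obs = 30.0                     # assumed measurement noise (kPa)
    weights *= np.exp(-0.5 * ((q_peak - q_obs) / sigma_obs) ** 2)
weights /= weights.sum()

print(f"prior mean peak resistance:     {q_peak.mean():.0f} kPa")
print(f"posterior after one observation: {np.sum(weights * q_peak):.0f} kPa")
```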

20 pages, 7291 KB  
Article
Mapping Delayed Canopy Loss and Durable Fire Refugia for the 2020 Wildfires in Washington State Using Multiple Sensors
by Anika M. Anderson, Meg A. Krawchuk, Flavie Pelletier and Jeffrey A. Cardille
Fire 2025, 8(6), 230; https://doi.org/10.3390/fire8060230 - 11 Jun 2025
Viewed by 1359
Abstract
Fire refugia are unburned and low-severity patches within wildfires that contribute heterogeneity important for retaining biodiversity and regenerating forest after fire. With increasingly intense and frequent wildfires in the Pacific Northwest, fire refugia are important for re-establishing fire-sensitive populations and maintaining resilience to future disturbances. Mapping fire refugia and delayed canopy loss is useful for understanding patterns in their distribution. The increasing abundance of satellite data and advanced analysis platforms offers the potential to map fire refugia in high detail. This study uses the Bayesian Updating of Land Cover (BULC-D) algorithm to map fire refugia and delayed canopy loss three years after fire. The algorithm compiles Normalized Burn Ratio data from Sentinel-2 and Landsat 8 and 9 and uses Bayes' theorem to map land cover changes. Four wildfires that occurred across Washington State in 2020 were mapped. Additionally, to consider the longevity of 'durable' fire refugia, the fire perimeters were analyzed to map delayed canopy loss in the years 2021–2023. The results showed that large losses of fire refugia can occur in the 1–3 years after fire due to delayed effects, though some patches endure.
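
The Bayes'-theorem step at the core of such an updating algorithm can be sketched for a single pixel; the likelihood values below are invented for illustration and differ from BULC-D's actual tables:

```python
# Per-pixel Bayesian updating in the spirit of the abstract: the probability
# of "canopy loss" is updated with each binary Normalized Burn Ratio (NBR)
# signal. Likelihoods are invented; the real algorithm's tables differ.

def update(prob_loss, nbr_says_loss,
           p_signal_given_loss=0.8, p_signal_given_stable=0.2):
    """One Bayes-rule update of P(canopy loss) from a binary NBR signal."""
    if nbr_says_loss:
        num = p_signal_given_loss * prob_loss
        den = num + p_signal_given_stable * (1 - prob_loss)
    else:
        num = (1 - p_signal_given_loss) * prob_loss
        den = num + (1 - p_signal_given_stable) * (1 - prob_loss)
    return num / den

p = 0.5  # noncommittal prior for one pixel
for obs in [True, True, False, True]:   # hypothetical Sentinel-2/Landsat sequence
    p = update(p, obs)
print(f"P(canopy loss) after 4 scenes: {p:.3f}")
```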

22 pages, 1817 KB  
Article
Umbrella Refinement of Ensembles—An Alternative View of Ensemble Optimization
by Johannes Stöckelmaier, Tümay Capraz and Chris Oostenbrink
Molecules 2025, 30(11), 2449; https://doi.org/10.3390/molecules30112449 - 3 Jun 2025
Cited by 1 | Viewed by 641
Abstract
The elucidation of protein dynamics, especially in the context of intrinsically disordered proteins, is challenging and requires cooperation between experimental studies and computational analysis. Molecular dynamics simulations are an essential investigative tool but often struggle to accurately quantify the conformational preferences of flexible proteins. To create a quantitatively validated conformational ensemble, such simulations may be refined against experimental data using Bayesian and maximum entropy methods. In this study, we present a method to optimize a conformational ensemble using Bayes' theorem in connection with a methodology derived from Umbrella Sampling. The resulting method, called Umbrella Refinement of Ensembles (URE), reduces the number of parameters to be optimized in comparison to classical Bayesian Ensemble Refinement and remains methodologically suitable for use with the forward-formulated Kullback–Leibler divergence. The method is validated on two established systems, an alanine–alanine zwitterion and the chignolin peptide, using nuclear magnetic resonance data from the literature.
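
A minimal reweighting sketch in the same family of methods (maximum-entropy ensemble refinement) shows the general idea; it is not the URE algorithm itself, whose umbrella-derived parametrization is given only in the paper. The observable values and target are assumed:

```python
# MaxEnt-style ensemble reweighting: exponentially reweight conformations so
# the ensemble average of one observable matches an experimental target.
# NOT the URE algorithm itself; data and target are assumed.
import numpy as np

rng = np.random.default_rng(1)
f = rng.normal(5.0, 1.0, 1000)   # per-conformation predicted observable (assumed)
f_exp = 5.4                       # hypothetical NMR-derived target value

def reweighted_mean(lam):
    w = np.exp(-lam * f)
    w /= w.sum()
    return np.sum(w * f)

# Bisection on the Lagrange multiplier so that <f>_w = f_exp.
lo, hi = -10.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    # the reweighted mean decreases in lam, so move the bracket accordingly
    if reweighted_mean(mid) > f_exp:
        lo = mid
    else:
        hi = mid
print(f"lambda = {mid:.3f}, reweighted mean = {reweighted_mean(mid):.3f}")
```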

24 pages, 3798 KB  
Article
Stochastic Optimal Control for Uncertain Structural Systems Under Random Excitations Based on Bayes Optimal Estimation
by Hua Lei, Zhao-Zhong Ying and Zu-Guang Ying
Buildings 2025, 15(9), 1579; https://doi.org/10.3390/buildings15091579 - 7 May 2025
Cited by 1 | Viewed by 472
Abstract
Stochastic vibration control of uncertain structures under random loading is an important problem, and its minimax optimal control strategy remains to be developed. In this paper, a stochastic optimal control strategy for uncertain structural systems under random excitations is proposed based on the minimax stochastic dynamical programming principle and the Bayes optimal estimation method, combining stochastic dynamics and Bayes inference. A general description of the stochastic optimal control problem is presented, including optimal parameter estimation and optimal state control. For the estimation, the posterior probability density conditional on observation states is expressed using the likelihood function conditional on system parameters according to Bayes' theorem. The likelihood is replaced by the geometrically averaged likelihood, and the posterior is converted into its logarithmic expression to avoid numerical singularity. The expressions of state statistics are derived based on stochastic dynamics. The statistics are further transformed into those conditional on observation states based on optimal state estimation. The obtained posterior is thus more reliable and accurate, and the optimal estimation greatly reduces the uncertain parameter domains. For the control, the minimax strategy is designed by minimizing the performance index for the worst-parameter system, which is obtained by maximizing the performance index based on game theory. The dynamical programming equation for the uncertain system is derived according to the minimax stochastic dynamical programming principle. The worst parameters are determined by maximizing this equation, and the optimal control is determined by minimizing the resulting equation. The minimax optimal control, combining Bayes optimal estimation with minimax stochastic dynamical programming, is more effective and robust. Finally, numerical results for a five-story frame structure under random excitations show the effectiveness of the proposed control strategy.
(This article belongs to the Special Issue The Vibration Control of Building Structures)
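
One simple reading of the numerical device mentioned here, sketched under assumed models and data: work with the log-posterior, and geometrically average the likelihood over the N observations (raise it to the power 1/N) so that products of many small densities neither dominate the prior nor underflow:

```python
# Log-posterior with a geometrically averaged likelihood (one plausible
# reading of the abstract's device). Model, prior, and data are invented.
import numpy as np

rng = np.random.default_rng(2)
theta_grid = np.linspace(0.5, 2.0, 301)   # candidate stiffness-like parameter
obs = rng.normal(1.3, 0.2, size=200)      # hypothetical observed states

def log_post(theta):
    loglik = np.sum(-0.5 * ((obs - theta) / 0.2) ** 2)   # Gaussian log-likelihood
    geo_loglik = loglik / len(obs)                        # geometric averaging (1/N)
    logprior = -0.5 * ((theta - 1.0) / 0.5) ** 2          # assumed Gaussian prior
    return geo_loglik + logprior

lp = np.array([log_post(t) for t in theta_grid])
print(f"MAP estimate: {theta_grid[np.argmax(lp)]:.3f}")
```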

15 pages, 5400 KB  
Article
Rapid Damage Assessment and Bayesian-Based Debris Prediction for Building Clusters Against Earthquakes
by Xiaowei Zheng, Yaozu Hou, Jie Cheng, Shuai Xu and Wenming Wang
Buildings 2025, 15(9), 1481; https://doi.org/10.3390/buildings15091481 - 27 Apr 2025
Cited by 3 | Viewed by 575
Abstract
Over their whole service life, building clusters encounter multiple hazards, including the disaster chain of earthquakes and building debris. Falling debris may block post-earthquake roads and severely affect evacuation, emergency, and recovery operations. It is therefore of great significance to develop a surrogate model for predicting the seismic responses of building clusters, together with a prediction model for post-earthquake debris. This paper presents a general methodology for developing a surrogate model for rapid calculation of the seismic responses of building clusters and a probabilistic prediction model for debris width. First, the building cluster is divided into several types of representative buildings according to building function. Second, the finite element (FE) method and the discrete element (DE) method are used, respectively, to generate the data pool of structural floor responses and debris widths. Finally, with the structural response data of maximum floor displacement, a surrogate model for rapidly calculating the seismic responses of structures is developed based on the XGBoost algorithm, achieving R² > 0.99 for floor displacements and R² = 0.989 for maximum inter-story drift ratio (MIDR) predictions. In addition, an unbiased probabilistic prediction model for the debris width of blockage is established with a Bayesian updating rule, reducing the standard deviation of the model error by 60% (from σ = 10.2 to σ = 4.1). The presented models are applied to evaluate the seismic damage of the campus building complex at the China University of Mining and Technology and to estimate the range of post-earthquake falling debris. The results indicate that the surrogate model reduces computational time by over 90% compared to traditional nonlinear time-history analysis. This application supports the development of disaster prevention and mitigation policies as well as post-earthquake rescue and evacuation strategies for urban building complexes.
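
The Bayesian-updating step for the debris model can be sketched in a simplified conjugate form: treat the error of a deterministic predictor as Gaussian and update its mean from observed pairs. All numbers are invented; the paper's data pool comes from discrete-element simulations:

```python
# Conjugate normal update of a deterministic model's bias from observed
# (predicted, actual) debris widths. All values are invented placeholders.
import numpy as np

pred = np.array([4.0, 5.5, 6.2, 7.1, 8.0])   # deterministic model output (m), assumed
true = np.array([4.6, 6.1, 7.0, 7.9, 9.1])   # DE "observations" (m), assumed
err = true - pred

mu0, tau0 = 0.0, 2.0        # prior: unbiased model with loose std (assumed)
sigma = err.std(ddof=1)     # plug-in observation noise
n = len(err)
tau_n = 1.0 / np.sqrt(1 / tau0**2 + n / sigma**2)            # posterior std
mu_n = tau_n**2 * (mu0 / tau0**2 + err.sum() / sigma**2)     # posterior mean
print(f"posterior bias: {mu_n:.2f} +/- {tau_n:.2f} m "
      f"-> corrected prediction = model output + bias")
```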

23 pages, 2975 KB  
Article
Coevolutionary Algorithm with Bayes Theorem for Constrained Multiobjective Optimization
by Shaoyu Zhao, Heming Jia, Yongchao Li and Qian Shi
Mathematics 2025, 13(7), 1191; https://doi.org/10.3390/math13071191 - 4 Apr 2025
Viewed by 424
Abstract
The effective resolution of constrained multi-objective optimization problems (CMOPs) requires a delicate balance between maximizing objectives and satisfying constraints. Previous studies have demonstrated that multi-swarm optimization models exhibit robust performance on CMOPs; however, their high computational resource demands can hinder convergence efficiency. This article proposes an environment selection model based on Bayes' theorem that leverages the advantages of dual populations. The model constructs prior knowledge from objective function values and constraint violation values and then integrates this information to enhance the selection process. By dynamically adjusting the selection of the auxiliary population based on prior knowledge, the algorithm significantly improves its adaptability to various CMOPs. Additionally, a population size adjustment strategy is introduced to mitigate the computational burden of dual populations. By using past prior knowledge to estimate the probability of function value changes, offspring allocation is dynamically adjusted, optimizing resource utilization. This adaptive adjustment prevents unnecessary computational waste during evolution, thereby enhancing both convergence and diversity. To validate the effectiveness of the proposed algorithm, comparative experiments were performed against seven constrained multi-objective evolutionary algorithms (CMOEAs) across three benchmark test sets and 12 real-world problems. The results show that the proposed algorithm outperforms the others in both convergence and diversity.
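
One plausible reading of the offspring-allocation idea, sketched with a Beta-Bernoulli Bayes update; this illustrates the mechanism only and is not the paper's algorithm:

```python
# Track each population's recent success rate with a Beta-Bernoulli Bayes
# update and allocate offspring in proportion to the posterior mean.
# A hedged illustration, not the paper's method.
import numpy as np

alpha = np.array([1.0, 1.0])   # Beta prior successes (main, auxiliary)
beta = np.array([1.0, 1.0])    # Beta prior failures

def record(pop, improved):
    """Bayes update after observing whether a population's offspring improved."""
    if improved:
        alpha[pop] += 1
    else:
        beta[pop] += 1

for pop, improved in [(0, True), (0, False), (1, True), (1, True)]:
    record(pop, improved)

p = alpha / (alpha + beta)   # posterior mean success probabilities
share = p / p.sum()
print(f"offspring shares (main, aux): {share.round(2)}")
```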

40 pages, 1167 KB  
Article
A Hyperbolic Sum Rule for Probability: Solving Recursive (“Chicken and Egg”) Problems
by Michael C. Parker, Chris Jeynes and Stuart D. Walker
Entropy 2025, 27(4), 352; https://doi.org/10.3390/e27040352 - 28 Mar 2025
Viewed by 955
Abstract
We prove that the probability of “A or B”, denoted p(A or B), where A and B are events or hypotheses that may be recursively dependent, is given by a “Hyperbolic Sum Rule” (HSR), which is relationally isomorphic to the hyperbolic tangent double-angle formula. We also prove that this HSR is Maximum Entropy (MaxEnt). Since this recursive dependency is commutative, it maintains the symmetry between the two events, while the recursiveness also represents temporal symmetry within the logical structure of the HSR. The possibility of recursive probabilities is excluded by the “Conventional Sum Rule” (CSR), which we have also proved to be MaxEnt (with lower entropy than the HSR due to its narrower domain of applicability). The concatenation property of the HSR is exploited to enable analytical, consistent, and scalable calculations for multiple hypotheses. Although such calculations are intrinsic to current artificial intelligence and machine learning applications, they are not conveniently available for the CSR; moreover, they are presently considered intractable for analytical study and methodological validation. Where, for two hypotheses, we have p(A|B) > 0 and p(B|A) > 0 together (where “A|B” means “A given B”), we show that either {A,B} is independent or {A,B} is recursively dependent. In general, recursive relations cannot be ruled out: the HSR should be used by default. Because the HSR is isomorphic to other physical quantities, including those of certain components that are important for digital signal processing, we also show that it is as reasonable to state that “probability is physical” as it is to state that “information is physical” (which is now recognised as a truism of communications network engineering); probability is not merely a mathematical construct. We relate this treatment to the physics of Quantitative Geometrical Thermodynamics, which is defined in complex hyperbolic (Minkowski) spacetime.
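
The abstract's statement that the HSR is isomorphic to the hyperbolic tangent double-angle formula suggests the form p(A or B) = (p(A) + p(B)) / (1 + p(A)p(B)); consult the paper for the exact statement and its conditions. Under that assumption, the two sum rules compare as follows:

```python
# Numeric comparison of the two sum rules. The HSR form below is inferred
# from the tanh double-angle isomorphism stated in the abstract; it is an
# assumption, not a quotation of the paper's formula.

def csr(pa, pb, pab):
    """Conventional Sum Rule: p(A or B) = p(A) + p(B) - p(A and B)."""
    return pa + pb - pab

def hsr(pa, pb):
    """Hyperbolic Sum Rule (assumed tanh-double-angle form)."""
    return (pa + pb) / (1 + pa * pb)

pa, pb = 0.6, 0.7
print(f"CSR with independence: {csr(pa, pb, pa * pb):.4f}")   # 0.8800
print(f"HSR:                   {hsr(pa, pb):.4f}")            # 0.9155
```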

13 pages, 819 KB  
Review
Should Medical Experts Giving Evidence in Criminal Trials Adhere to ENFSI Forensic Guidelines in Evaluative Reporting?
by Neil Allan Robertson Munro
Forensic Sci. 2025, 5(1), 13; https://doi.org/10.3390/forensicsci5010013 - 17 Mar 2025
Viewed by 642
Abstract
Miscarriages of justice have led to concerns that forensic science reports were prosecution-biased and contained elementary errors of probability. The European Network of Forensic Science Institutes (ENFSI) and other institutes developed standards requiring reporting of the probability of the evidence under all hypotheses (usually prosecution and defence hypotheses) via the likelihood ratio (LR), LR = p(E|Hp)/p(E|Hd), with values > 1 being probative for the prosecution hypothesis. In elementary two-variable conditional probability theory (Bayes' theorem), the LR is also an updating factor that multiplies the odds of guilt for each item of evidence considered. Although this is not true for multiple-variable probability theory, the value of the LR as a valid measure of evidential weight remains. Forensic scientists are experts in evidence and should not stray into the role of the Court, which is to consider the probability of the hypotheses given the totality of the evidence: p(Hp, Hd | E1, E2, …, En). Medical experts may be required to assist the court with diagnoses (the hypothesis), but this privilege is balanced by vigilance that experts do not stray beyond their expertise. A narrow interpretation of expertise hinders the evaluation of the evidence under hypotheses adjacent to the area of expertise. This paradox may be overcome by experts declaring competence in areas adjacent to their main area of expertise. Regulatory bodies do not currently require medical experts to adhere to ENFSI guidelines on evaluative reporting. Legal opinion is divided on whether probability theory can be applied to cases requiring medical expertise. Medical experts should, in their reports, clearly separate evaluating the probability of the evidence (where evaluative reporting should apply) from evaluating the probability of hypotheses, where methodology should be prioritised over opinion. The reckless misapplication of elementary probability theory, typically by transposing conditional probabilities or neglecting prior odds, may mislead the jury into believing that the posterior odds of guilt are many orders of magnitude greater than in reality. Medical experts should declare training in elementary probability theory. Inaccurate probabilities are a joint enterprise between all who inform or advise the jury, so all must be trained in elementary probability theory.
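
The odds form of Bayes' theorem described here is easy to make concrete. The numbers below are invented for illustration:

```python
# Worked example of the odds form of Bayes' theorem used in the abstract:
# posterior odds = LR x prior odds. All numbers are invented.
p_E_given_Hp = 0.95     # P(evidence | prosecution hypothesis)
p_E_given_Hd = 0.001    # P(evidence | defence hypothesis)
LR = p_E_given_Hp / p_E_given_Hd             # 950: strongly probative

prior_odds = 1 / 10_000                      # e.g., a 1-in-10,000 prior
posterior_odds = LR * prior_odds             # ~0.095, still well below evens
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"LR = {LR:.0f}, posterior P(Hp|E) = {posterior_prob:.3f}")

# The "transposed conditional" fallacy the abstract warns about would report
# P(Hp|E) as 1 - p_E_given_Hd = 0.999, overstating guilt by a huge factor.
print(f"fallacious value: {1 - p_E_given_Hd:.3f}")
```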

20 pages, 3976 KB  
Article
Machine Learning for Quality Diagnostics: Insights into Consumer Electronics Evaluation
by Najada Firza, Anisa Bakiu and Alfonso Monaco
Electronics 2025, 14(5), 939; https://doi.org/10.3390/electronics14050939 - 27 Feb 2025
Viewed by 948
Abstract
In the era of digital commerce, understanding consumer opinions has become crucial for businesses aiming to tailor their products and services effectively. This study investigates acoustic quality diagnostics of the latest generation of AirPods. From this perspective, the work examines consumer sentiment using text mining and sentiment analysis techniques applied to product reviews, focusing on Amazon’s AirPods reviews. Using the naïve Bayes classifier, a probabilistic machine learning approach grounded in Bayes’ theorem, this research analyzes textual data to classify consumer reviews as positive or negative. Data were collected via web scraping, following ethical guidelines, and preprocessed to ensure quality and relevance. Textual features were transformed using term frequency-inverse document frequency (TF-IDF) to create input vectors for the classifier. The results reveal that naïve Bayes provides satisfactory performance in categorizing sentiment, with metrics such as accuracy, sensitivity, specificity, and F1-score offering insight into the model’s effectiveness. Key findings highlight the divergence in consumer perception across ratings, identifying sentiment drivers such as noise cancellation quality and product integration. These insights underline the potential of sentiment analysis in enabling companies to address consumer concerns, improve offerings, and optimize business strategies. The study concludes that such methodologies are indispensable for leveraging consumer feedback in the rapidly evolving digital marketplace.
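
A minimal sketch of the described pipeline (TF-IDF features feeding a naïve Bayes classifier), assuming scikit-learn; the reviews and labels are invented stand-ins for the scraped Amazon data:

```python
# TF-IDF + multinomial naive Bayes sentiment pipeline, as described in the
# abstract. Training texts and labels are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["noise cancellation is superb", "battery died in a week",
           "seamless pairing with my phone", "sound keeps cutting out"]
labels = ["positive", "negative", "positive", "negative"]

clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(reviews, labels)
print(clf.predict(["pairing works great but the sound is muffled"]))
```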

15 pages, 331 KB  
Article
Analyzing Sample Size in Information-Theoretic Models
by D. Bernal-Casas and J. M. Oller
Mathematics 2024, 12(24), 4018; https://doi.org/10.3390/math12244018 - 21 Dec 2024
Viewed by 792
Abstract
In this paper, we delve into the complexities of information-theoretic models, specifically focusing on how we can model sample size and how it affects our previous findings. This question is fundamental and intricate, posing a significant intellectual challenge to our research. While previous studies have considered a fixed sample size, this work explores other possible alternatives to assess its impact on the mathematical approach. To ensure that our framework aligns with the principles of quantum theory, specific conditions related to sample size must be met, as they are inherently linked to information quantities. The arbitrary nature of sample size presents a significant challenge in achieving this alignment, which we thoroughly investigate in this study.
4 pages, 175 KB  
Reply
Reply to Onkenhout et al. Comment on “van Gemert et al. Child Abuse, Misdiagnosed by an Expertise Center—Part II—Misuse of Bayes’ Theorem. Children 2023, 10, 843”
by Martin J. C. van Gemert, Marianne Vlaming, Peter G. J. Nikkels and Peter J. van Koppen
Children 2024, 11(12), 1430; https://doi.org/10.3390/children11121430 - 26 Nov 2024
Viewed by 680
Abstract
We thank the authors for their Comments [...]
9 pages, 213 KB  
Comment
Comment on van Gemert et al. Child Abuse, Misdiagnosed by an Expertise Center—Part II—Misuse of Bayes’ Theorem. Children 2023, 10, 843
by Nina H. Onkenhout, Heike C. Terlingen, Michelle Nagtegaal, Elise M. van de Putte, Stephen C. Boos and Charles E. H. Berger
Children 2024, 11(12), 1429; https://doi.org/10.3390/children11121429 - 26 Nov 2024
Cited by 1 | Viewed by 957
Abstract
Recently, part I and part II of a series of three papers were published, namely the papers by Vlaming et al. [...]
14 pages, 647 KB  
Article
Bayesian Knowledge Infusion for Studying Historical Sunspot Numbers
by Wenxin Jiang and Haisheng Ji
Universe 2024, 10(9), 370; https://doi.org/10.3390/universe10090370 - 14 Sep 2024
Viewed by 910
Abstract
A scientific method that proposes a value Y to estimate a target value ρ is often subject to some level of uncertainty. In the Bayesian framework, the level of uncertainty can be measured by the width of the 68% interval, which is the range of the middle 68% of the ranked ρ values sampled from the posterior distribution p(ρ|Y). This paper considers Bayesian knowledge infusion (BKI) to reduce the uncertainty of the posterior distribution p(ρ|Y) based on additional knowledge that an event A happens. BKI is achieved by using a conditional prior distribution p(ρ|A) in Bayes' theorem, assuming that, given the true ρ, its error-contaminated value Y is independent of event A. We use two examples to illustrate how to study whether it is possible to reduce the uncertainty of the ¹⁴C reconstruction (Y) of the annual sunspot number (SSN) (ρ) by infusing additional information (A) using BKI. The information (A) that the SSN is from a year with a Far Eastern record of naked-eye sunspots is found to be not very effective in reducing the uncertainty. In contrast, the information that the SSN is from a year at a cycle minimum is found to be very effective, producing much narrower 68% intervals. The resulting Bayesian point estimates of the SSN (the posterior medians of ρ) are cross-validated and tested on a subset of telescopically observed SSNs that were unused in the Bayes computation.
(This article belongs to the Section Astroinformatics and Astrostatistics)
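
A grid-based sketch of the BKI idea under assumed distributions: conditioning the prior on A (here, "year at a cycle minimum") narrows the 68% interval relative to an unconditional prior, while the likelihood p(Y|ρ) is unchanged because Y is assumed independent of A given ρ:

```python
# Bayesian knowledge infusion on a grid: swap an unconditional prior for a
# conditional prior p(rho|A) before applying Bayes' theorem. All
# distributions and the observed Y are assumed for illustration.
import numpy as np

rho = np.arange(0, 201)                                # candidate SSN values
prior_uncond = np.exp(-0.5 * ((rho - 80) / 50) ** 2)   # broad prior (assumed)
prior_given_A = np.exp(-0.5 * ((rho - 10) / 8) ** 2)   # cycle-minimum prior (assumed)

def posterior(prior, y=30.0, noise=25.0):
    like = np.exp(-0.5 * ((y - rho) / noise) ** 2)     # p(Y|rho), independent of A
    post = like * prior
    return post / post.sum()

for name, pr in [("unconditional", prior_uncond), ("given A", prior_given_A)]:
    cdf = posterior(pr).cumsum()
    lo, hi = rho[np.searchsorted(cdf, 0.16)], rho[np.searchsorted(cdf, 0.84)]
    print(f"{name:>13}: 68% interval = [{lo}, {hi}]")
```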
