
Bayesianism

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (15 June 2024) | Viewed by 19206

Special Issue Editors


Guest Editor: Prof. Dr. Carlos Alberto de Bragança Pereira
Institute of Mathematics and Statistics, University of São Paulo, Rua do Matão, 1010, São Paulo 05508-900, Brazil
Interests: Bayesian statistics; controversies and paradoxes in probability and statistics; Bayesian reliability; Bayesian analysis of discrete data (BADD); applied statistics

Guest Editor: Prof. Dr. Paulo Canas Rodrigues
Department of Statistics, Federal University of Bahia, Salvador 40170-110, Brazil
Interests: statistical learning; time series forecasting; robust statistics; data science; applied statistics

Guest Editor: Dr. Mark Andrew Gannon
Independent researcher
Interests: hypothesis testing; fundamentals of statistical inference; regression models; analysis of sports data; theoretical physics; probability theory and stochastic processes

Special Issue Information

Dear Colleagues,

Many statistics textbooks treat frequentist statistics as the main subject and relegate Bayesian statistics to a single chapter, often near the end of the book. Students are often told that there are two “schools” of statistics, the frequentist (or “classical”) and the Bayesian, and that the two are in opposition to each other. Students are implicitly, or sometimes even explicitly, asked to choose between them. The reality, of course, is much more complex. There are other views and tools—likelihoodism, decision theory, “objective Bayesianism,” and fiducial inference, for example—meant to sit between the supposed extremes of subjective Bayesianism and frequentism. Some of these use priors and posteriors that are not probability functions (they do not integrate to one). There are statisticians, the chief editor of this Special Issue being one of them, who are considered Bayesians but who have used frequentist techniques or applied frequentist concepts in their work. It is worth remarking here that some of the main tools of Bayesian statistics used in the 21st century are based on frequencies in simulations, collectively known as Markov chain Monte Carlo. Bayesian methods have also been used by statisticians considered frequentists to solve problems that arise in frequentist statistics. Even Sir Ronald Fisher, who first proposed fiducial inference and is considered the “founding father” of frequentist inference, was in his later work moving toward some of the inductive arguments of Bayesian inference and placing stronger emphasis on the likelihood.

In the end, statistics comes down to trying to infer something about a larger population from a smaller sample, and any approach to such a task will have strengths and weaknesses, will surely have pathological cases it cannot resolve, and will be subject to valid criticisms. Therefore, it is not surprising that mixing the ideas of the different “schools” of inference has been a successful approach and has expanded and enriched the palette of tools available to researchers in every quantitative field of study.  

The idea of this Special Issue is to treat Bayesian statistics as the main topic, but that does not mean it is to be treated as superior to any other paradigm of inference. Contributions from practitioners and theoreticians using and advancing Bayesian thought and methods are obviously welcome, but so are contributions from those who use other paradigms, including criticisms of aspects of Bayesian inference in comparison to the authors' preferred methods. Our idea is to provide a snapshot of how Bayesian inference is understood and how it contributes to scientific endeavor today, late in the first quarter of the 21st century.

When we speak of Bayesianism, we are referring to a philosophical and statistical framework that involves the representation of degrees of belief or justification using probabilities. It is characterized by the idea that belief comes in degrees that can be formalized using the axioms of probability theory. Bayesianism involves the assessment of the rationality of degrees of belief based on a set of rules, and these beliefs can be updated using Bayes's theorem based on new information or evidence.
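
As a minimal illustration of this updating rule (not tied to any particular paper in this issue), the following Python sketch carries a Beta prior on a success probability through Bayes's theorem after observing binomial data; the prior parameters and the counts are invented for the example.

```python
from scipy import stats

# Made-up example: degree of belief about a success probability theta.
# Prior: Beta(a, b) encodes the initial belief.
a, b = 2.0, 2.0

# New evidence: 7 successes in 10 trials.
successes, trials = 7, 10

# Bayes's theorem with a conjugate Beta prior gives a Beta posterior:
# posterior ~ Beta(a + successes, b + failures).
a_post = a + successes
b_post = b + (trials - successes)

prior = stats.beta(a, b)
posterior = stats.beta(a_post, b_post)

print(f"prior mean     = {prior.mean():.3f}")
print(f"posterior mean = {posterior.mean():.3f}")
print(f"95% credible interval = {posterior.interval(0.95)}")
```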

Prof. Dr. Carlos Alberto De Bragança Pereira
Prof. Dr. Paulo Canas Rodrigues
Dr. Mark Andrew Gannon
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • prior distributions
  • posterior probabilities or densities
  • likelihood optimizations: weighted average or maximization

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (16 papers)


Research

Jump to: Other

14 pages, 290 KiB  
Article
Bayesian Assessment of Corrosion-Related Failures in Steel Pipelines
by Fabrizio Ruggeri, Enrico Cagno, Franco Caron, Mauro Mancini and Antonio Pievatolo
Entropy 2024, 26(12), 1111; https://doi.org/10.3390/e26121111 - 19 Dec 2024
Viewed by 291
Abstract
The probability of gas escapes from steel pipelines due to different types of corrosion is studied with real failure data from an urban gas distribution network. Both the design and maintenance of the network are considered, identifying and estimating (in a Bayesian framework) an elementary multinomial model in the first case, and a more sophisticated non-homogeneous Poisson process in the second case. Special attention is paid to the elicitation of the experts’ opinions. We conclude that the corrosion process behaves quite differently depending on the type of corrosion, and that, in most cases, cathodically protected pipes should be installed. Full article
(This article belongs to the Special Issue Bayesianism)
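
As a loose illustration of the kind of "elementary multinomial model in a Bayesian framework" the abstract above mentions (not the authors' actual model or data), the sketch below updates a Dirichlet prior over failure-type proportions with hypothetical counts and summarizes the posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical counts of failures by corrosion type (not the paper's data).
failure_types = ["galvanic", "pitting", "stress", "other"]
counts = np.array([12, 30, 7, 4])

# Dirichlet prior on the failure-type probabilities; alpha could encode
# elicited expert opinion (here: weakly informative).
alpha_prior = np.ones(len(counts))

# Conjugacy: Dirichlet prior + multinomial counts -> Dirichlet posterior.
alpha_post = alpha_prior + counts

# Posterior means and Monte Carlo interval summaries for each probability.
post_mean = alpha_post / alpha_post.sum()
draws = rng.dirichlet(alpha_post, size=10_000)

for name, mean, lo, hi in zip(
    failure_types, post_mean,
    np.percentile(draws, 2.5, axis=0), np.percentile(draws, 97.5, axis=0)
):
    print(f"{name:9s} posterior mean {mean:.3f}  95% CI ({lo:.3f}, {hi:.3f})")
```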

19 pages, 691 KiB  
Article
A Bayesian Approach for Modeling and Forecasting Solar Photovoltaic Power Generation
by Mariana Villela Flesch, Carlos Alberto de Bragança Pereira and Erlandson Ferreira Saraiva
Entropy 2024, 26(10), 824; https://doi.org/10.3390/e26100824 - 27 Sep 2024
Viewed by 880
Abstract
In this paper, we propose a Bayesian approach to estimate the curve of a function f(·) that models the solar power generated at k moments per day for n days and to forecast the curve for the (n+1)th day by using the history of recorded values. We assume that f(·) is an unknown function and adopt a Bayesian model with a Gaussian-process prior on the vector of values f(t) = (f(1), …, f(k)). An advantage of this approach is that we may estimate the curves of f(·) and f_{n+1}(·) as “smooth functions” obtained by interpolating between the points generated from a k-variate normal distribution with appropriate mean vector and covariance matrix. Since the joint posterior distribution for the parameters of interest does not have a known mathematical form, we describe how to implement a Gibbs sampling algorithm to obtain estimates for the parameters. The good performance of the proposed approach is illustrated using two simulation studies and an application to a real dataset. As performance measures, we calculate the absolute percentage error, the mean absolute percentage error (MAPE), and the root-mean-square error (RMSE). In all simulated cases and in the application to real-world data, the MAPE and RMSE values were all near 0, indicating the very good performance of the proposed approach. Full article
(This article belongs to the Special Issue Bayesianism)
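
The paper's Gibbs sampler is not reproduced here; as a much-simplified sketch of placing a Gaussian-process prior on the vector of within-day values, the following Python snippet computes the conjugate posterior mean of a curve on an invented hourly grid and reports MAPE and RMSE against the synthetic truth.

```python
import numpy as np

rng = np.random.default_rng(1)

def sq_exp_cov(s, t, sigma2=1.0, length=3.0):
    """Squared-exponential covariance between time grids s and t."""
    d = s[:, None] - t[None, :]
    return sigma2 * np.exp(-0.5 * (d / length) ** 2)

# Hypothetical hourly grid (k moments per day) and one noisy "observed" day.
k = 24
t = np.arange(k, dtype=float)
true_curve = np.maximum(0.0, np.sin((t - 6) * np.pi / 12))  # toy solar-like shape
y = true_curve + rng.normal(0.0, 0.05, size=k)

# Gaussian-process prior on the vector (f(1), ..., f(k)) plus Gaussian noise.
noise_var = 0.05 ** 2
K = sq_exp_cov(t, t)
f_post_mean = K @ np.linalg.solve(K + noise_var * np.eye(k), y)

# Error summaries of the kind reported in the abstract (daytime points only).
mask = true_curve > 0.1
mape = np.mean(np.abs(f_post_mean[mask] - true_curve[mask]) / true_curve[mask])
rmse = np.sqrt(np.mean((f_post_mean - true_curve) ** 2))
print(f"MAPE = {mape:.4f}, RMSE = {rmse:.4f}")
```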

19 pages, 353 KiB  
Article
Relative Belief Inferences from Decision Theory
by Michael Evans and Gun Ho Jang
Entropy 2024, 26(9), 786; https://doi.org/10.3390/e26090786 - 14 Sep 2024
Viewed by 595
Abstract
Relative belief inferences are shown to arise as Bayes rules or limiting Bayes rules. These inferences are invariant under reparameterizations and possess a number of optimal properties. In particular, relative belief inferences are based on a direct measure of statistical evidence. Full article
(This article belongs to the Special Issue Bayesianism)
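
As a small, generic illustration of the relative belief idea referred to above (not the paper's derivations), the sketch below computes the ratio of posterior to prior probabilities over a toy discrete parameter space; values above 1 indicate evidence in favor of that parameter value.

```python
import numpy as np

# Toy discrete parameter space for a success probability (illustrative only).
theta = np.array([0.2, 0.4, 0.5, 0.6, 0.8])
prior = np.full(theta.shape, 1 / len(theta))  # uniform prior beliefs

# Invented data: 13 successes in 20 Bernoulli trials.
successes, trials = 13, 20
likelihood = theta**successes * (1 - theta) ** (trials - successes)

posterior = prior * likelihood
posterior /= posterior.sum()

# Relative belief ratio: posterior belief divided by prior belief.
rb = posterior / prior
for th, r in zip(theta, rb):
    print(f"theta = {th:.1f}  relative belief ratio = {r:.2f}")
```
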
12 pages, 815 KiB  
Article
Dose Finding in Oncology Trials Guided by Ordinal Toxicity Grades Using Continuous Dose Levels
by Mourad Tighiouart and André Rogatko
Entropy 2024, 26(8), 687; https://doi.org/10.3390/e26080687 - 14 Aug 2024
Viewed by 874
Abstract
We present a Bayesian adaptive design for dose finding in oncology trials with application to a first-in-human trial. The design is based on the escalation with overdose control principle and uses an intermediate grade 2 toxicity in addition to the traditional binary indicator of dose-limiting toxicity (DLT) to guide the dose escalation and de-escalation. We model the dose–toxicity relationship using the proportional odds model. This assumption satisfies an important ethical concern when a potentially toxic drug is first introduced in the clinic: if a patient experiences at most grade 2 toxicity, then the amount of dose escalation is lower than it would be if that patient had experienced at most grade 1 toxicity. This results in a more careful dose escalation. The performance of the design was assessed by deriving the operating characteristics under several scenarios for the true MTD and expected proportions of grade 2 toxicities. In general, the trial design is safe and achieves acceptable efficiency of the estimated MTD for a planned sample size of twenty patients. At the time of writing this manuscript, twelve patients have been enrolled in the trial. Full article
(This article belongs to the Special Issue Bayesianism)
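
The trial design itself is not reproduced here; the following sketch only illustrates a proportional-odds dose–toxicity relationship of the kind the abstract describes, with invented intercepts and slope.

```python
import numpy as np

def toxicity_probabilities(dose, alphas=(-1.0, -3.0), beta=1.2):
    """Proportional-odds sketch of a dose-toxicity relationship (invented coefficients).

    Ordered outcome: 0 = at most grade 1, 1 = grade 2, 2 = dose-limiting toxicity (DLT).
    Cumulative probabilities: P(outcome >= j | dose) = logistic(alpha_j + beta * dose).
    """
    alphas = np.asarray(alphas)
    cum = 1.0 / (1.0 + np.exp(-(alphas + beta * dose)))  # [P(>= grade 2), P(DLT)]
    p_low = 1.0 - cum[0]        # at most grade 1
    p_grade2 = cum[0] - cum[1]  # intermediate grade 2 toxicity
    p_dlt = cum[1]              # dose-limiting toxicity
    return p_low, p_grade2, p_dlt

for dose in (0.5, 1.0, 2.0):
    low, g2, dlt = toxicity_probabilities(dose)
    print(f"dose {dose:.1f}: P(<= grade 1) = {low:.2f}, "
          f"P(grade 2) = {g2:.2f}, P(DLT) = {dlt:.2f}")
```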

22 pages, 3992 KiB  
Article
Bayesian Modeling for Nonstationary Spatial Point Process via Spatial Deformations
by Dani Gamerman, Marcel de Souza Borges Quintana and Mariane Branco Alves
Entropy 2024, 26(8), 678; https://doi.org/10.3390/e26080678 - 11 Aug 2024
Viewed by 942
Abstract
Many techniques have been proposed to model space-varying observation processes with a nonstationary spatial covariance structure and/or anisotropy, usually in a geostatistical framework. Nevertheless, there is increasing interest in point process applications, and methodologies that take nonstationarity into account are welcome. In this sense, this work proposes an extension of a class of spatial Cox processes using spatial deformation. The proposed method enables the deformation behavior to be data-driven, through a multivariate latent Gaussian process. Inference leads to intractable posterior distributions that are approximated via MCMC. The convergence of algorithms based on Metropolis–Hastings steps proved to be slow, and the computational efficiency of the Bayesian updating scheme was improved by adopting Hamiltonian Monte Carlo (HMC) methods. Our proposal was also compared against an alternative anisotropic formulation. Studies based on synthetic data provided empirical evidence of the benefit brought by the adoption of nonstationarity through our anisotropic structure. A real data application was conducted on the spatial spread of the Spodoptera frugiperda pest in a corn-producing agricultural area in southern Brazil. Once again, the proposed method demonstrated its benefit over alternatives. Full article
(This article belongs to the Special Issue Bayesianism)

18 pages, 516 KiB  
Article
Likelihood Inference for Factor Copula Models with Asymmetric Tail Dependence
by Harry Joe and Xiaoting Li
Entropy 2024, 26(7), 610; https://doi.org/10.3390/e26070610 - 19 Jul 2024
Viewed by 987
Abstract
For multivariate non-Gaussian data involving copulas, likelihood inference is dominated by the data in the middle, and fitted models might not be very good for joint tail inference, such as assessing the strength of tail dependence. When preliminary data and likelihood analysis suggest asymmetric tail dependence, a method is proposed to improve extreme value inferences based on the joint lower and upper tails. A prior that uses previous information on tail dependence can be used in combination with the likelihood. With the combination of the prior and the likelihood (which in practice has some degree of misspecification) to obtain a tilted log-likelihood, inferences with suitably transformed parameters can be based on Bayesian computing methods or on numerical optimization of the tilted log-likelihood to obtain the posterior mode and Hessian at this mode. Full article
(This article belongs to the Special Issue Bayesianism)
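
As a toy illustration of combining a log-likelihood with a log-prior into a tilted log-likelihood and summarizing it by its mode and Hessian (using a deliberately simple exponential model rather than the paper's copula models), consider the following sketch.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)

# Toy data and model (not the paper's): exponential observations with
# unknown rate; the parameter is mapped to the real line via log.
data = rng.exponential(scale=1 / 1.5, size=200)  # true rate = 1.5

def tilted_neg_loglik(log_rate):
    """Negative tilted log-likelihood: -(log-likelihood + log-prior)."""
    rate = np.exp(log_rate)
    loglik = np.sum(stats.expon.logpdf(data, scale=1 / rate))
    logprior = stats.norm.logpdf(log_rate, loc=0.0, scale=1.0)  # prior on log(rate)
    return -(loglik + logprior)

# Posterior mode on the transformed scale by numerical optimization.
res = optimize.minimize_scalar(tilted_neg_loglik, bounds=(-5, 5), method="bounded")
mode = res.x

# Hessian at the mode by a central second difference; its inverse gives a
# Laplace-style approximate posterior variance on the transformed scale.
h = 1e-4
hess = (tilted_neg_loglik(mode + h) - 2 * tilted_neg_loglik(mode)
        + tilted_neg_loglik(mode - h)) / h**2
print(f"posterior mode of rate approx. {np.exp(mode):.3f}, "
      f"approx. sd of log(rate) {np.sqrt(1 / hess):.3f}")
```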

16 pages, 471 KiB  
Article
A Metric Based on the Efficient Determination Criterion
by Jesús E. García, Verónica A. González-López and Johsac I. Gomez Sanchez
Entropy 2024, 26(6), 526; https://doi.org/10.3390/e26060526 - 19 Jun 2024
Viewed by 736
Abstract
This paper extends the concept of metrics based on the Bayesian information criterion (BIC), to achieve strongly consistent estimation of partition Markov models (PMMs). We introduce a set of metrics drawn from the family of model selection criteria known as efficient determination criteria (EDC). This generalization extends the range of options available in BIC for penalizing the number of model parameters. We formally specify the relationship that determines how EDC works when selecting a model based on a threshold associated with the metric. Furthermore, we improve the penalty options within EDC, identifying the penalty ln(ln(n)) as a viable choice that maintains the strongly consistent estimation of a PMM. To demonstrate the utility of these new metrics, we apply them to the modeling of three DNA sequences of dengue virus type 3, endemic in Brazil in 2023. Full article
(This article belongs to the Special Issue Bayesianism)
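
As a rough illustration of how an EDC-style penalty such as ln(ln(n)) compares with the BIC penalty when scoring models of different sizes (with hypothetical log-likelihood values, not the paper's DNA application), consider the following sketch.

```python
import numpy as np

def penalized_score(loglik, n_params, n, penalty):
    """Generic model-selection score: log-likelihood minus a complexity penalty."""
    return loglik - n_params * penalty(n)

# Two penalty choices: the BIC penalty (ln n)/2 per parameter and an
# EDC-style ln(ln(n)) penalty, as in the family described above.
bic_penalty = lambda n: 0.5 * np.log(n)
edc_penalty = lambda n: np.log(np.log(n))

# Hypothetical fits: maximized log-likelihoods for models of growing size.
n = 5000
fits = {"small": (-1420.0, 4), "medium": (-1405.0, 16), "large": (-1399.0, 64)}

for name, (loglik, k) in fits.items():
    print(f"{name:6s}  BIC-type score = {penalized_score(loglik, k, n, bic_penalty):9.1f}   "
          f"EDC(ln ln n) score = {penalized_score(loglik, k, n, edc_penalty):9.1f}")
```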

13 pages, 18755 KiB  
Article
Bayesian Spatio-Temporal Modeling of the Dynamics of COVID-19 Deaths in Peru
by César Raúl Castro Galarza, Omar Nolberto Díaz Sánchez, Jonatha Sousa Pimentel, Rodrigo Bulhões, Javier Linkolk López-Gonzales and Paulo Canas Rodrigues
Entropy 2024, 26(6), 474; https://doi.org/10.3390/e26060474 - 30 May 2024
Cited by 1 | Viewed by 842
Abstract
Amid the COVID-19 pandemic, understanding the spatial and temporal dynamics of the disease is crucial for effective public health interventions. This study aims to analyze COVID-19 data in Peru using a Bayesian spatio-temporal generalized linear model to elucidate mortality patterns and assess the impact of vaccination efforts. Leveraging data from 194 provinces over 651 days, our analysis reveals heterogeneous spatial and temporal patterns in COVID-19 mortality rates. Higher vaccination coverage is associated with reduced mortality rates, emphasizing the importance of vaccination in mitigating the pandemic’s impact. The findings underscore the value of spatio-temporal data analysis in understanding disease dynamics and guiding targeted public health interventions. Full article
(This article belongs to the Special Issue Bayesianism)

19 pages, 2693 KiB  
Article
Bayesian Non-Parametric Inference for Multivariate Peaks-over-Threshold Models
by Peter Trubey and Bruno Sansó
Entropy 2024, 26(4), 335; https://doi.org/10.3390/e26040335 - 14 Apr 2024
Viewed by 1571
Abstract
We consider a constructive definition of the multivariate Pareto that factorizes the random vector into a radial component and an independent angular component. The former follows a univariate Pareto distribution, and the latter is defined on the surface of the positive orthant of the infinity-norm unit hypercube. We propose a method for inferring the distribution of the angular component by identifying its support as the limit of the positive orthants of the unit p-norm spheres and introduce a projected gamma family of distributions defined through the normalization of a vector of independent random gammas to that space. This serves to construct a flexible family of distributions obtained as a Dirichlet process mixture of projected gammas. For model assessment, we discuss scoring methods appropriate to distributions on the unit hypercube. In particular, working with the energy score criterion, we develop a kernel metric that produces a proper scoring rule and present a simulation study to compare different modeling choices using the proposed metric. Using our approach, we describe the dependence structure of extreme values in the integrated vapor transport (IVT) data, which describe the flow of atmospheric moisture along the coast of California. We find clear but heterogeneous geographical dependence. Full article
(This article belongs to the Special Issue Bayesianism)
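
As a small numerical illustration of the projected gamma construction mentioned in the abstract (with invented shape parameters and a finite p standing in for the infinity-norm limit), the sketch below normalizes vectors of independent gammas by their p-norm so that the draws lie on the positive orthant of the unit p-norm sphere.

```python
import numpy as np

rng = np.random.default_rng(3)

def projected_gamma_sample(shape_params, p, size=1):
    """Draw points on the positive orthant of the unit p-norm sphere by
    normalizing vectors of independent gamma variables by their p-norm.
    (Illustrative sketch; the shape parameters are invented.)"""
    g = rng.gamma(shape=shape_params, size=(size, len(shape_params)))
    norms = np.sum(g**p, axis=1) ** (1 / p)
    return g / norms[:, None]

# Three-dimensional angular component; p = 10 roughly approximates the
# infinity-norm geometry referred to in the abstract.
samples = projected_gamma_sample(shape_params=np.array([2.0, 1.0, 3.0]), p=10, size=5)
print(samples)
print("p-norms:", np.sum(samples**10, axis=1) ** (1 / 10))  # all approximately 1
```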

16 pages, 439 KiB  
Article
Stochastic Volatility Models with Skewness Selection
by Igor Martins and Hedibert Freitas Lopes
Entropy 2024, 26(2), 142; https://doi.org/10.3390/e26020142 - 6 Feb 2024
Viewed by 1841
Abstract
This paper expands traditional stochastic volatility models by allowing for time-varying skewness without imposing it. While dynamic asymmetry may capture the likely direction of future asset returns, it comes at the risk of leading to overparameterization. Our proposed approach mitigates this concern by leveraging sparsity-inducing priors to automatically select the skewness parameter as dynamic, static or zero in a data-driven framework. We consider two empirical applications. First, in a bond yield application, dynamic skewness captures interest rate cycles of monetary easing and tightening and is partially explained by central banks’ mandates. In a currency modeling framework, our model indicates no skewness in the carry factor after accounting for stochastic volatility. This supports the idea of carry crashes resulting from volatility surges instead of dynamic skewness. Full article
(This article belongs to the Special Issue Bayesianism)

24 pages, 394 KiB  
Article
On the Nuisance Parameter Elimination Principle in Hypothesis Testing
by Andrés Felipe Flórez Rivera, Luis Gustavo Esteves, Victor Fossaluza and Carlos Alberto de Bragança Pereira
Entropy 2024, 26(2), 117; https://doi.org/10.3390/e26020117 - 29 Jan 2024
Viewed by 1079
Abstract
The Non-Informative Nuisance Parameter Principle concerns the problem of how inferences about a parameter of interest should be made in the presence of nuisance parameters. The principle is examined in the context of the hypothesis testing problem. We prove that the mixed test obeys the principle for discrete sample spaces. We also show how adherence of the mixed test to the principle can make performance of the test much easier. These findings are illustrated with new solutions to well-known problems of testing hypotheses for count data. Full article
(This article belongs to the Special Issue Bayesianism)
25 pages, 1048 KiB  
Article
An Objective and Robust Bayes Factor for the Hypothesis Test One Sample and Two Population Means
by Israel A. Almodóvar-Rivera and Luis R. Pericchi-Guerra
Entropy 2024, 26(1), 88; https://doi.org/10.3390/e26010088 - 20 Jan 2024
Viewed by 1732
Abstract
It has been over 100 years since the discovery of one of the most fundamental statistical tests: the Student’s t test. However, reliable conventional and objective Bayesian procedures are still essential for routine practice. In this work, we proposed an objective and robust Bayesian approach for hypothesis testing for one-sample and two-sample mean comparisons when the assumption of equal variances holds. The newly proposed Bayes factors are based on the intrinsic and Berger robust prior. Additionally, we introduced a corrected version of the Bayesian Information Criterion (BIC), denoted BIC-TESS, which is based on the effective sample size (TESS), for comparing two population means. We studied our developed Bayes factors in several simulation experiments for hypothesis testing. Our methodologies consistently provided strong evidence in favor of the null hypothesis in the case of equal means and variances. Finally, we applied the methodology to the original Gosset sleep data, concluding strong evidence favoring the hypothesis that the average sleep hours differed between the two treatments. These methodologies exhibit finite sample consistency and demonstrate consistent qualitative behavior, proving reasonably close to each other in practice, particularly for moderate to large sample sizes. Full article
(This article belongs to the Special Issue Bayesianism)
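
The intrinsic and Berger-robust-prior Bayes factors developed in the paper are not reproduced here; purely as a rough, generic stand-in, the sketch below uses the familiar BIC approximation to a Bayes factor for the equal-means question on made-up two-sample data with a common variance.

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up two-sample data with equal variances (not the Gosset sleep data).
x = rng.normal(0.7, 1.8, size=10)
y = rng.normal(2.3, 1.8, size=10)

def gaussian_loglik(data, mean, var):
    return np.sum(-0.5 * np.log(2 * np.pi * var) - (data - mean) ** 2 / (2 * var))

n = len(x) + len(y)
pooled = np.concatenate([x, y])

# H0: common mean and common variance (2 parameters).
loglik0 = gaussian_loglik(pooled, pooled.mean(), pooled.var())
bic0 = -2 * loglik0 + 2 * np.log(n)

# H1: separate means, common variance (3 parameters).
resid = np.concatenate([x - x.mean(), y - y.mean()])
var1 = resid.var()
loglik1 = gaussian_loglik(x, x.mean(), var1) + gaussian_loglik(y, y.mean(), var1)
bic1 = -2 * loglik1 + 3 * np.log(n)

# Rough BIC-based approximation to the Bayes factor in favor of different means.
bf10 = np.exp((bic0 - bic1) / 2)
print(f"approximate Bayes factor BF10 = {bf10:.2f}")
```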

13 pages, 283 KiB  
Article
Linear Bayesian Estimation of Misrecorded Poisson Distribution
by Huiqing Gao, Zhanshou Chen and Fuxiao Li
Entropy 2024, 26(1), 62; https://doi.org/10.3390/e26010062 - 11 Jan 2024
Viewed by 1334
Abstract
Parameter estimation is an important component of statistical inference, and how to improve the accuracy of parameter estimation is a key issue in research. This paper proposes a linear Bayesian estimation for estimating parameters in a misrecorded Poisson distribution. The linear Bayesian estimation method not only adopts prior information but also avoids the cumbersome calculation of posterior expectations. On the premise of ensuring the accuracy and stability of computational results, we derived the explicit solution of the linear Bayesian estimation. Its superiority was verified through numerical simulations and illustrative examples. Full article
(This article belongs to the Special Issue Bayesianism)
15 pages, 1540 KiB  
Article
Objective Priors for Invariant e-Values in the Presence of Nuisance Parameters
by Elena Bortolato and Laura Ventura
Entropy 2024, 26(1), 58; https://doi.org/10.3390/e26010058 - 9 Jan 2024
Viewed by 1268
Abstract
This paper aims to contribute to refining the e-values for testing precise hypotheses, especially when dealing with nuisance parameters, leveraging the effectiveness of asymptotic expansions of the posterior. The proposed approach offers the advantage of bypassing the need for elicitation of priors and reference functions for the nuisance parameters and the multidimensional integration step. For this purpose, starting from a Laplace approximation, only a posterior distribution for the parameter of interest is considered, and then a suitable objective matching prior is introduced, ensuring that the posterior mode aligns with an equivariant frequentist estimator. Consequently, both Highest Probability Density credible sets and the e-value remain invariant. Some targeted and challenging examples are discussed. Full article
(This article belongs to the Special Issue Bayesianism)

12 pages, 1080 KiB  
Article
Probabilistic Nearest Neighbors Classification
by Bruno Fava, Paulo C. Marques F. and Hedibert F. Lopes
Entropy 2024, 26(1), 39; https://doi.org/10.3390/e26010039 - 30 Dec 2023
Viewed by 1551
Abstract
Analysis of the currently established Bayesian nearest neighbors classification model points to a connection between the computation of its normalizing constant and issues of NP-completeness. An alternative predictive model constructed by aggregating the predictive distributions of simpler nonlocal models is proposed, and analytic expressions for the normalizing constants of these nonlocal models are derived, ensuring polynomial time computation without approximations. Experiments with synthetic and real datasets showcase the predictive performance of the proposed predictive model. Full article
(This article belongs to the Special Issue Bayesianism)

Other

Jump to: Research

13 pages, 224 KiB  
Perspective
Statistics as a Social Activity: Attitudes toward Amalgamating Evidence
by Andrew Gelman and Keith O’Rourke
Entropy 2024, 26(8), 652; https://doi.org/10.3390/e26080652 - 30 Jul 2024
Viewed by 888
Abstract
Amalgamation of evidence in statistics is conducted in several ways. Within a study, multiple observations are combined by averaging, or as factors in a likelihood or prediction algorithm. In multilevel modeling or Bayesian analysis, population or prior information is combined with data using the weighted averaging derived from probability modeling. In a scientific research project, inferences from data analysis are interpreted in light of mechanistic models and substantive theories. Within a scholarly or applied research community, data and conclusions from separate laboratories are amalgamated through a series of steps, including peer review, meta-analysis, review articles, and replication studies. These issues have been discussed for many years in the philosophy of science and statistics, gaining attention in recent decades first with the renewed popularity of Bayesian inference and then with concerns about the replication crisis in science. In this article, we review the amalgamation of statistical evidence from different perspectives, connecting the foundations of statistics to the social processes of validation, criticism, and consensus building. Full article
(This article belongs to the Special Issue Bayesianism)