Search Results (152)

Search Parameters:
Keywords = Metropolis-Hastings

31 pages, 521 KB  
Article
Bayesian Analysis of Nonlinear Quantile Structural Equation Model with Possible Non-Ignorable Missingness
by Lu Zhang and Mulati Tuerde
Mathematics 2025, 13(19), 3094; https://doi.org/10.3390/math13193094 - 26 Sep 2025
Viewed by 188
Abstract
This paper develops a nonlinear quantile structural equation model via the Bayesian approach, aiming to more accurately analyze the relationships between latent variables, with special attention paid to the issue of non-ignorable missing data in the model. The model not only incorporates quantile regression to examine the relationships between latent variables at different quantile levels but also features a specially designed mechanism for handling missing data. The non-ignorable missing mechanism is specified through a logistic regression model, and a combined method of Gibbs sampling and Metropolis–Hastings sampling is adopted for missing value imputation, while simultaneously estimating unknown parameters, latent variables, and parameters in the missing data model. To verify the effectiveness of the proposed method, simulation studies are conducted under conditions of different sample sizes and missing rates. The results of these simulation studies indicate that the developed method performs excellently in handling complex data structures and missing data. Furthermore, this paper demonstrates the practical application value of the nonlinear quantile structural equation model through a case study on the growth of listed companies, providing researchers in related fields with a new analytical tool.
(This article belongs to the Special Issue Research on Dynamical Systems and Differential Equations, 2nd Edition)
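
The Gibbs-plus-Metropolis–Hastings scheme described in the abstract can be sketched generically. Below is a minimal MH-within-Gibbs loop for a toy two-variable target in which one full conditional is conjugate (drawn exactly) and the other is non-standard (handled by a random-walk MH step); the target density, step size, and iteration count are illustrative assumptions, not the paper's structural equation model.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_cond_x2(x2, x1):
    # unnormalized log full-conditional of x2 given x1 under the toy joint
    # pi(x1, x2) proportional to exp(-x1^2/2 - (x2 - x1)^2/2 - x2^4/20)
    return -0.5 * (x2 - x1) ** 2 - x2 ** 4 / 20.0

n_iter, step = 5_000, 1.0
x1, x2 = 0.0, 0.0
samples = np.empty((n_iter, 2))
for t in range(n_iter):
    # Gibbs step: x1 | x2 is conjugate, N(x2/2, 1/2), so draw it exactly
    x1 = rng.normal(0.5 * x2, np.sqrt(0.5))
    # MH step: x2 | x1 is non-standard, so use a random-walk proposal
    prop = x2 + step * rng.normal()
    if np.log(rng.uniform()) < log_cond_x2(prop, x1) - log_cond_x2(x2, x1):
        x2 = prop
    samples[t] = x1, x2
```

In the paper's setting, the exact draws would presumably cover the conjugate parameter blocks and imputed missing values, with MH reserved for the non-conjugate conditionals.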

14 pages, 1009 KB  
Article
A Bayesian ARMA Probability Density Estimator
by Jeffrey D. Hart
Entropy 2025, 27(10), 1001; https://doi.org/10.3390/e27101001 - 26 Sep 2025
Viewed by 206
Abstract
A Bayesian approach for constructing ARMA probability density estimators is proposed. Such estimators are ratios of trigonometric polynomials and have a number of advantages over Fourier series estimators, including parsimony and greater efficiency under common conditions. The Bayesian approach is carried out via MCMC, the output of which can be used to obtain probability intervals for unknown parameters and the underlying density. Finite sample efficiency and methods for choosing the estimator’s smoothing parameter are considered in a simulation study, and the ideas are illustrated with data on a wine attribute.
(This article belongs to the Section Signal and Data Analysis)

10 pages, 790 KB  
Proceeding Paper
A Comparison of MCMC Algorithms for an Inverse Squeeze Flow Problem
by Aricia Rinkens, Rodrigo L. S. Silva, Clemens V. Verhoosel, Nick O. Jaensson and Erik Quaeghebeur
Phys. Sci. Forum 2025, 12(1), 4; https://doi.org/10.3390/psf2025012004 - 22 Sep 2025
Viewed by 155
Abstract
Using Bayesian inference to calibrate constitutive model parameters has recently seen a rise in interest. The Markov chain Monte Carlo (MCMC) algorithm is one of the most commonly used methods to sample from the posterior. However, the choice of which MCMC algorithm to apply is typically pragmatic and based on considerations such as software availability and experience. We compare three commonly used MCMC algorithms: Metropolis–Hastings (MH), the Affine Invariant Stretch Move (AISM), and the No-U-Turn sampler (NUTS). For the comparison, we use the Kullback–Leibler (KL) divergence as a convergence criterion, which measures the statistical distance between the sampled and the ‘true’ posterior. We apply the Bayesian framework to a Newtonian squeeze flow problem, for which there exists an analytical model. Furthermore, we have collected experimental data using a tailored setup. The ground truth for the posterior is obtained by evaluating it on a uniform reference grid. We conclude that, for the same number of samples, NUTS yields the lowest KL divergence, followed by the AISM sampler and, last, the MH sampler.
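
The KL-divergence criterion used for the comparison can be approximated by histogramming the chain on the same uniform grid as the reference posterior. A minimal one-dimensional sketch, with a synthetic normal standing in for the grid-evaluated ‘true’ posterior (grid, sample size, and target are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# reference ("true") posterior evaluated on a uniform grid -- here a toy normal
grid = np.linspace(-5, 5, 101)
p = np.exp(-0.5 * grid ** 2)
p /= p.sum()                            # discretized reference probabilities

# empirical distribution of the chain on the same grid (toy chain: iid draws)
samples = rng.normal(size=20_000)
edges = np.linspace(-5.05, 5.05, 102)   # bins centered on the grid points
q, _ = np.histogram(samples, bins=edges)
q = q / q.sum()

# KL(p || q); a small epsilon guards empty bins
eps = 1e-12
kl = np.sum(p * np.log((p + eps) / (q + eps)))
print(f"KL divergence: {kl:.4f}")
```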

31 pages, 12350 KB  
Article
Statistical Evaluation of Beta-Binomial Probability Law for Removal in Progressive First-Failure Censoring and Its Applications to Three Cancer Cases
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Mathematics 2025, 13(18), 3028; https://doi.org/10.3390/math13183028 - 19 Sep 2025
Viewed by 285
Abstract
Progressive first-failure censoring is a flexible and cost-efficient strategy that captures real-world testing scenarios where only the first failure is observed at each stage while randomly removing remaining units, making it ideal for biomedical and reliability studies. By applying the α-power transformation to the exponential baseline, the proposed model introduces an additional flexibility parameter that enriches the family of lifetime distributions, enabling it to better capture varying failure rates and diverse hazard rate behaviors commonly observed in biomedical data, thus extending the classical exponential model. This study develops a novel computational framework for analyzing an α-powered exponential model under beta-binomial random removals within the proposed censoring test. To address the inherent complexity of the likelihood function arising from simultaneous random removals and progressive censoring, we derive closed-form expressions for the likelihood, survival, and hazard functions and propose efficient estimation strategies based on both maximum likelihood and Bayesian inference. For the Bayesian approach, gamma and beta priors are adopted, and a tailored Metropolis–Hastings algorithm is implemented to approximate posterior distributions under symmetric and asymmetric loss functions. To evaluate the empirical performance of the proposed estimators, extensive Monte Carlo simulations are conducted, examining bias, mean squared error, and credible interval coverage under varying censoring levels and removal probabilities. Furthermore, the practical utility of the model is illustrated through three oncological datasets, including multiple myeloma, lung cancer, and breast cancer patients, demonstrating superior goodness of fit and predictive reliability compared to traditional models. The results show that the proposed lifespan model, under the beta-binomial probability law and within the examined censoring mechanism, offers a flexible and computationally tractable framework for reliability and biomedical survival analysis, providing new insights into censored data structures with random withdrawals.
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
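
As a generic illustration of the Bayesian machinery, the sketch below runs a random-walk Metropolis–Hastings sampler for an exponential rate under a gamma prior; the mean of the retained draws is the Bayes estimate under squared-error loss, one of the symmetric losses the abstract mentions. The data, prior hyperparameters, and tuning constants are assumptions, and the paper's α-powered likelihood with beta-binomial removals is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.exponential(scale=1 / 1.5, size=50)   # synthetic lifetimes, true rate 1.5
a, b = 2.0, 1.0                                  # assumed gamma prior shape and rate

def log_post(lam):
    if lam <= 0:
        return -np.inf
    # exponential log-likelihood plus gamma(a, b) log-prior, up to a constant
    return len(data) * np.log(lam) - lam * data.sum() + (a - 1) * np.log(lam) - b * lam

chain = np.empty(10_000)
lam, lp = 1.0, log_post(1.0)
for t in range(chain.size):
    prop = lam + 0.3 * rng.normal()              # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        lam, lp = prop, lp_prop
    chain[t] = lam

# posterior mean of the post-burn-in draws = Bayes estimate under squared-error loss
print("SEL estimate:", chain[2_000:].mean())
```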

29 pages, 19296 KB  
Article
Inference for the Chris–Jerry Lifetime Distribution Under Improved Adaptive Progressive Type-II Censoring for Physics and Engineering Data Modelling
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(9), 702; https://doi.org/10.3390/axioms14090702 - 17 Sep 2025
Viewed by 241
Abstract
This paper presents a comprehensive reliability analysis framework for the Chris–Jerry (CJ) lifetime distribution under an improved adaptive progressive Type-II censoring plan. The CJ model, recently introduced to capture skewed lifetime behaviors, is studied under a modified censoring structure designed to provide greater flexibility in terminating life-testing experiments. We derive maximum likelihood estimators for the CJ parameters and key reliability measures, including the reliability and hazard rate functions, and construct approximate confidence intervals using the observed Fisher information matrix and the delta method. To address the intractability of the likelihood function, Bayesian estimators are obtained under independent gamma priors and a squared-error loss function. Because the posterior distributions are not available in closed form, we apply the Metropolis–Hastings algorithm to generate Bayesian estimates and two types of credible intervals. A comprehensive simulation study evaluates the performance of the proposed estimation techniques under various censoring scenarios. The framework is further validated through two real-world datasets: one involving rainfall measurements and another concerning mechanical failure times. In both cases, the CJ model combined with the proposed censoring strategy demonstrates superior fit and reliability inference compared to competing models. These findings highlight the value of the CJ distribution, together with advanced censoring methods, for modeling lifetime data in physics and engineering applications.
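
The delta-method intervals mentioned above follow a generic recipe: propagate the inverse observed Fisher information through the gradient of the functional of interest. A sketch with assumed numbers and a hypothetical functional g standing in for the CJ reliability function:

```python
import numpy as np

# toy MLE and inverse observed Fisher information (assumed values for illustration)
theta_hat = np.array([1.2, 0.8])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])

def g(theta):
    # hypothetical smooth functional of the parameters, e.g. a reliability R(t)
    return np.exp(-theta[0] * 2.0 ** theta[1])

# numeric gradient of g at the MLE via central differences
h = 1e-6
grad = np.array([(g(theta_hat + h * e) - g(theta_hat - h * e)) / (2 * h)
                 for e in np.eye(2)])

se = np.sqrt(grad @ cov @ grad)          # delta-method standard error of g(theta_hat)
est = g(theta_hat)
print(est - 1.96 * se, est + 1.96 * se)  # approximate 95% confidence interval
```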

34 pages, 31211 KB  
Article
Statistical Evaluation of Alpha-Powering Exponential Generalized Progressive Hybrid Censoring and Its Modeling for Medical and Engineering Sciences with Optimization Plans
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Symmetry 2025, 17(9), 1473; https://doi.org/10.3390/sym17091473 - 6 Sep 2025
Viewed by 518
Abstract
This study explores advanced methods for analyzing the two-parameter alpha-power exponential (APE) distribution using data from a novel generalized progressive hybrid censoring scheme. The APE model is inherently asymmetric, exhibiting positive skewness across all valid parameter values due to its right-skewed exponential base, with the alpha-power transformation amplifying or dampening this skewness depending on the power parameter. The proposed censoring design offers new insights into modeling lifetime data that exhibit non-monotonic hazard behaviors. It enhances testing efficiency by simultaneously imposing fixed-time constraints and ensuring a minimum number of failures, thereby improving inference quality over traditional censoring methods. We derive maximum likelihood and Bayesian estimates for the APE distribution parameters and key reliability measures, such as the reliability and hazard rate functions. Bayesian analysis is performed using independent gamma priors under a symmetric squared error loss, implemented via the Metropolis–Hastings algorithm. Interval estimation is addressed using two normality-based asymptotic confidence intervals and two credible intervals obtained through a simulated Markov Chain Monte Carlo procedure. Monte Carlo simulations across various censoring scenarios demonstrate the stable and superior precision of the proposed methods. Optimal censoring patterns are identified based on the observed Fisher information and its inverse. Two real-world case studies—breast cancer remission times and global oil reserve data—illustrate the practical utility of the APE model within the proposed censoring framework. These applications underscore the model’s capability to effectively analyze diverse reliability phenomena, bridging theoretical innovation with empirical relevance in lifetime data analysis.
(This article belongs to the Special Issue Unlocking the Power of Probability and Statistics for Symmetry)
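
Of the two kinds of credible intervals computed from MCMC output, the equal-tailed interval is a pair of chain percentiles, while a highest-posterior-density (HPD) interval is the shortest window containing the target mass; the abstract does not name its two types, so the standard pair is assumed here. A sketch on a toy gamma "posterior":

```python
import numpy as np

def hpd_interval(draws, cred=0.95):
    """Shortest (highest posterior density) interval from MCMC output."""
    s = np.sort(np.asarray(draws))
    m = int(np.ceil(cred * len(s)))          # number of draws inside the interval
    widths = s[m - 1:] - s[:len(s) - m + 1]  # width of every candidate window
    i = np.argmin(widths)
    return s[i], s[i + m - 1]

rng = np.random.default_rng(9)
draws = rng.gamma(3.0, 1.0, size=20_000)     # toy posterior sample
print(hpd_interval(draws))                   # HPD interval
print(np.percentile(draws, [2.5, 97.5]))     # equal-tailed interval, for comparison
```

For a right-skewed posterior like this one, the HPD interval sits noticeably left of the equal-tailed interval.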

54 pages, 22294 KB  
Article
Research on Risk Evolution Probability of Urban Lifeline Natech Events Based on MdC-MCMC
by Shifeng Li and Yu Shang
Sustainability 2025, 17(17), 7664; https://doi.org/10.3390/su17177664 - 25 Aug 2025
Viewed by 884
Abstract
Urban lifeline Natech events are coupled systems composed of multiple risks and entities with complex dynamic transmission chains. Predicting risk evolution probabilities is the core task in the safety management of urban lifeline Natech events. First, the risk evolution mechanism is analyzed: urban lifeline Natech events exhibit spatial evolution characteristics, and the parallel and synergistic effects of risk evolution in the spatial dimension are dissected. Next, based on fitted marginal probability distribution functions for natural hazard and urban lifeline risk evolution, a Multi-dimensional Copula (MdC) function for the joint probability distribution of urban lifeline Natech event risk evolution is constructed. Building on the MdC function, a Markov Chain Monte Carlo (MCMC) model for predicting risk evolution probabilities of urban lifeline Natech events is developed using the Metropolis–Hastings (M-H) algorithm and Gibbs sampling. Finally, taking the 2021 Zhengzhou ‘7·20’ catastrophic rainstorm as a case study, joint probability distribution functions for risk evolution under rainfall/wind-speed scenarios are fitted for the traffic, electric power, communication, water supply, and drainage systems (including different risk transmission chains); numerical simulations of the joint probability distributions are conducted; and visualizations of the joint probability predictions are generated.
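
The Gibbs ingredient of such an MdC-MCMC scheme can be illustrated with the textbook two-dimensional case: alternate draws from the full conditionals of a bivariate Gaussian, then push the output through the normal CDF to obtain copula-scale uniform scores that fitted marginals would map back to physical units (rainfall, wind speed). The two-dimensional Gaussian copula and the correlation value below are simplifying assumptions, not the paper's multi-dimensional construction.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
rho = 0.8                                # assumed dependence between two risk margins
draws = np.empty((10_000, 2))
x = y = 0.0
for t in range(len(draws)):
    # full conditionals of a standard bivariate normal with correlation rho
    x = rng.normal(rho * y, np.sqrt(1 - rho ** 2))
    y = rng.normal(rho * x, np.sqrt(1 - rho ** 2))
    draws[t] = x, y

u = norm.cdf(draws)                      # uniform scores from the Gaussian copula
# u[:, 0] and u[:, 1] would be mapped through the fitted marginal inverse CDFs
```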

33 pages, 6324 KB  
Article
The Inverted Hjorth Distribution and Its Applications in Environmental and Pharmaceutical Sciences
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Symmetry 2025, 17(8), 1327; https://doi.org/10.3390/sym17081327 - 14 Aug 2025
Viewed by 430
Abstract
This study introduces an inverted version of the three-parameter Hjorth lifespan model, characterized by one scale parameter and two shape parameters, referred to as the inverted Hjorth (IH) distribution. This asymmetric distribution can fit various positively skewed datasets more accurately than several existing models in the literature, as it can accommodate data exhibiting an inverted (upside-down) bathtub-shaped hazard rate. We derive key properties of the model, including quantiles, moments, reliability measures, stress–strength reliability, and order statistics. Point estimation of the IH model parameters is performed using maximum likelihood and Bayesian approaches. Moreover, for interval estimation, two types of asymptotic confidence intervals and two types of Bayesian credible intervals are obtained using the same estimation methodologies. As an extension to a complete sampling plan, Type-II censoring is employed to examine the impact of data incompleteness on IH parameter estimation. Monte Carlo simulation results indicate that Bayesian point and credible estimates outperform those obtained via classical estimation methods across several precision metrics, including mean squared error, average absolute bias, average interval length, and coverage probability. To further assess its performance, two real datasets are analyzed: one from the environmental domain (minimum monthly water flows of the Piracicaba River) and another from the pharmacological domain (plasma indomethacin concentrations). The superiority and flexibility of the inverted Hjorth model are evaluated and compared with several competing models. The results confirm that the IH distribution provides a better fit than several existing lifetime models—such as the inverted Gompertz, inverted log-logistic, inverted Lomax, and inverted Nadarajah–Haghighi distributions—making it a valuable tool for reliability and survival data analysis.
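
Assuming the usual reciprocal construction of inverted models and the standard Hjorth survival function S(t) = (1 + βt)^(−θ/β) exp(−δt²/2), IH variates can be simulated by numerically inverting S and taking reciprocals; the parameter values below are arbitrary:

```python
import numpy as np
from scipy.optimize import brentq

delta, theta, beta = 0.5, 1.0, 2.0          # assumed Hjorth parameters

def S(t):
    # Hjorth survival function: (1 + beta*t)^(-theta/beta) * exp(-delta*t^2/2)
    return (1 + beta * t) ** (-theta / beta) * np.exp(-delta * t ** 2 / 2)

rng = np.random.default_rng(10)
u = rng.uniform(0.001, 0.999, size=5)       # avoid the extreme tails for brentq
# invert S numerically (find t with S(t) = u), then take reciprocals for IH draws
t = np.array([brentq(lambda s, ui=ui: S(s) - ui, 1e-9, 100.0) for ui in u])
x = 1.0 / t                                 # inverted-Hjorth variates
print(x)
```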

16 pages, 882 KB  
Article
MatBYIB: A MATLAB-Based Toolkit for Parameter Estimation of Eccentric Gravitational Waves from EMRIs
by Genliang Li, Shujie Zhao, Huaike Guo, Jingyu Su and Zhenheng Lin
Universe 2025, 11(8), 259; https://doi.org/10.3390/universe11080259 - 6 Aug 2025
Viewed by 382
Abstract
Accurate parameter estimation is essential for gravitational wave data analysis. In extreme mass-ratio inspiral binary systems, orbital eccentricity is a critical parameter to estimate. However, current software for gravitational wave parameter estimation often neglects direct estimation of orbital eccentricity. To fill this gap, we have developed MatBYIB, a MATLAB-based software package (Version 1.0) for parameter estimation of gravitational waves with arbitrary eccentricity. MatBYIB employs the Analytical Kludge waveform as a computationally efficient signal generator and computes parameter uncertainties via the Fisher Information Matrix and Markov chain Monte Carlo. For Bayesian inference, we implement the Metropolis–Hastings algorithm to derive posterior distributions. To guarantee convergence, the Gelman–Rubin criterion (the Potential Scale Reduction Factor $\hat{R}$) is used to determine sampling adequacy, with MatBYIB dynamically increasing the sample size until $\hat{R} < 1.05$ for all parameters. Our results demonstrate strong agreement between predictions based on the Fisher Information Matrix and full MCMC sampling. The program is user-friendly and allows estimation of gravitational wave parameters with arbitrary eccentricity on standard personal computers.
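
The stopping rule is the standard Gelman–Rubin diagnostic. A minimal version of the computation on a synthetic multi-chain array (the split-chain refinement used by some tools is omitted):

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for an (m, n) array of m chains."""
    m, n = chains.shape
    means = chains.mean(axis=1)
    B_over_n = means.var(ddof=1)               # between-chain variance / n
    W = chains.var(axis=1, ddof=1).mean()      # mean within-chain variance
    var_hat = (n - 1) / n * W + B_over_n       # pooled posterior variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(3)
chains = rng.normal(size=(4, 2_000))           # four toy chains at stationarity
print(gelman_rubin(chains))                    # ~1.0; the toolkit stops at R-hat < 1.05
```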

23 pages, 3124 KB  
Article
Bee Swarm Metropolis–Hastings Sampling for Bayesian Inference in the Ginzburg–Landau Equation
by Shucan Xia and Lipu Zhang
Algorithms 2025, 18(8), 476; https://doi.org/10.3390/a18080476 - 2 Aug 2025
Viewed by 419
Abstract
To improve the sampling efficiency of Markov chain Monte Carlo in complex parameter spaces, this paper proposes an adaptive sampling method that integrates a swarm intelligence mechanism, called the BeeSwarm-MH algorithm. The method combines global exploration by scout bees with local exploitation by worker bees. It employs multi-stage perturbation intensities and adaptive step-size tuning to enable efficient posterior sampling. Focusing on Bayesian parameter estimation for the soliton solutions of the two-dimensional complex Ginzburg–Landau equation, we design a dedicated inference framework to systematically compare the performance of BeeSwarm-MH with the classical Metropolis–Hastings algorithm. Experimental results demonstrate that BeeSwarm-MH achieves comparable estimation accuracy while significantly reducing the number of iterations and total computation time required for convergence. Moreover, it exhibits superior global search capability and adaptivity, offering a practical approach to efficient Bayesian inference in complex physical models.
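
BeeSwarm-MH itself is the authors' design, but its adaptive step-size ingredient can be illustrated with a generic acceptance-rate-targeting random-walk sampler; the toy target, adaptation constant, and 0.44 acceptance goal are standard assumptions rather than details from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):
    return -0.5 * x ** 2                 # toy standard-normal target

x, step, accepts = 0.0, 5.0, 0           # deliberately poor initial step size
chain = np.empty(20_000)
for t in range(chain.size):
    prop = x + step * rng.normal()
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
        accepts += 1
    chain[t] = x
    if (t + 1) % 100 == 0:
        # nudge the step toward ~44% acceptance, the 1-D random-walk optimum
        rate = accepts / (t + 1)
        step *= np.exp(0.1 * (rate - 0.44))

# in practice, freeze adaptation and discard the adaptive phase before inference
print(step, accepts / chain.size)
```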

35 pages, 11039 KB  
Article
Optimum Progressive Data Analysis and Bayesian Inference for Unified Progressive Hybrid INH Censoring with Applications to Diamonds and Gold
by Heba S. Mohammed, Osama E. Abo-Kasem and Ahmed Elshahhat
Axioms 2025, 14(8), 559; https://doi.org/10.3390/axioms14080559 - 23 Jul 2025
Viewed by 356
Abstract
A novel unified progressive hybrid censoring scheme is introduced that combines progressive and hybrid censoring plans, allowing flexible test termination either after a prespecified number of failures or at a fixed time. This work develops both frequentist and Bayesian inferential procedures for estimating the parameters, reliability, and hazard rates of the inverted Nadarajah–Haghighi lifespan model when a sample is produced from such a censoring plan. Maximum likelihood estimators are obtained through the Newton–Raphson iterative technique. The delta method, based on the Fisher information matrix, is utilized to build asymptotic confidence intervals for each unknown quantity. In the Bayesian methodology, Markov chain Monte Carlo techniques with independent gamma priors are implemented to generate posterior summaries and credible intervals, addressing computational intractability through the Metropolis–Hastings algorithm. Extensive Monte Carlo simulations compare the efficiency and utility of frequentist and Bayesian estimates across multiple censoring designs, highlighting the superiority of Bayesian inference with informative priors. Two real-world applications to rare minerals, drawn from gold and diamond durability studies, demonstrate the adaptability of the proposed estimators to the analysis of rare events in precious materials science and the framework's practical utility in modeling the reliability and failure behavior of rare and high-value minerals. By applying four different optimality criteria to multiple competing plans, the progressive censoring strategies that yield the best performance are identified.
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)
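
The Newton–Raphson step behind the maximum likelihood estimators has the generic form: update the parameter by the score divided by its derivative. A sketch for a simpler model, the Weibull profile likelihood in its shape parameter (chosen for brevity; the inverted Nadarajah–Haghighi score is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.weibull(2.0, size=200) * 1.5       # synthetic lifetimes, shape 2, scale 1.5

def score(k):
    # derivative of the Weibull profile log-likelihood in the shape parameter k
    xk = x ** k
    return 1 / k + np.log(x).mean() - (xk * np.log(x)).sum() / xk.sum()

k = 1.0                                    # starting value
for _ in range(50):
    h = 1e-6
    d_score = (score(k + h) - score(k - h)) / (2 * h)   # numeric derivative
    step = score(k) / d_score                           # Newton-Raphson update
    k -= step
    if abs(step) < 1e-10:
        break

scale = (x ** k).mean() ** (1 / k)         # profile MLE of the scale parameter
print(k, scale)
```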

20 pages, 774 KB  
Article
Robust Variable Selection via Bayesian LASSO-Composite Quantile Regression with Empirical Likelihood: A Hybrid Sampling Approach
by Ruisi Nan, Jingwei Wang, Hanfang Li and Youxi Luo
Mathematics 2025, 13(14), 2287; https://doi.org/10.3390/math13142287 - 16 Jul 2025
Viewed by 553
Abstract
Since the advent of composite quantile regression (CQR), its inherent robustness has established it as a pivotal methodology for high-dimensional data analysis. High-dimensional outlier contamination refers to data scenarios where the number of observed dimensions p is large relative to the sample size n (e.g., p/n > 0.1) and there are extreme outliers in the response variables or covariates. Traditional penalized regression techniques, however, exhibit notable vulnerability to outliers during high-dimensional variable selection, often leading to biased parameter estimates and compromised resilience. To address this limitation, we propose a novel empirical likelihood (EL)-based variable selection framework that integrates a Bayesian LASSO penalty within composite quantile regression. By constructing a hybrid sampling mechanism that embeds the Expectation–Maximization (EM) and Metropolis–Hastings (M-H) algorithms within a Gibbs sampling scheme, the approach effectively tackles variable selection in high-dimensional settings with outlier contamination. This design enables simultaneous optimization of regression coefficients and penalty parameters, circumventing the ad hoc selection of optimal penalty parameters that is a long-standing challenge in conventional LASSO estimation. Moreover, the proposed method imposes no restrictive assumptions on the distribution of the model's random errors. Through Monte Carlo simulations under outlier interference and an empirical analysis of two U.S. house price datasets, we demonstrate that the new approach significantly enhances variable selection accuracy, reduces estimation bias for key regression coefficients, and exhibits robust resistance to outlier contamination.

24 pages, 2253 KB  
Article
Modeling Spatial Data with Heteroscedasticity Using PLVCSAR Model: A Bayesian Quantile Regression Approach
by Rongshang Chen and Zhiyong Chen
Entropy 2025, 27(7), 715; https://doi.org/10.3390/e27070715 - 1 Jul 2025
Viewed by 473
Abstract
Spatial data not only enable smart cities to visualize, analyze, and interpret location- and space-related information, but also help departments make more informed decisions. We apply Bayesian quantile regression (BQR) to the partially linear varying coefficient spatial autoregressive (PLVCSAR) model to improve predictive performance for spatial data. The model can capture linear and nonlinear covariate effects on the response at different quantile points. Approximating the nonparametric functions with free-knot splines, we develop a Bayesian sampling approach implemented via Markov chain Monte Carlo (MCMC) and design an efficient Metropolis–Hastings-within-Gibbs algorithm to explore the joint posterior distributions. Computational efficiency is achieved through a modified reversible-jump MCMC algorithm with adaptive movement steps to accelerate chain convergence. Simulation results demonstrate that our estimator is robust to alternative spatial weight matrices and outperforms both quantile regression (QR) and instrumental variable quantile regression (IVQR) in finite samples at different quantiles. The effectiveness of the proposed model and estimation method is demonstrated on real data: Boston median house prices.
(This article belongs to the Special Issue Bayesian Hierarchical Models with Applications)
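
The working likelihood behind Bayesian quantile regression is the asymmetric Laplace density, whose log-likelihood is the negative check loss. A stripped-down sketch for a plain linear model, with flat priors, fixed scale, synthetic data, and none of the paper's spatial lag, varying coefficients, or reversible-jump machinery:

```python
import numpy as np

rng = np.random.default_rng(6)
n, tau = 200, 0.5                                    # sample size and quantile level
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=n)

def check_loss(u):
    return u * (tau - (u < 0))                       # quantile check function rho_tau

def log_post(beta):
    # asymmetric-Laplace working likelihood (unit scale) with flat priors on beta
    return -check_loss(y - X @ beta).sum()

beta = np.zeros(2)
lp = log_post(beta)
chain = np.empty((10_000, 2))
for t in range(len(chain)):
    prop = beta + 0.1 * rng.normal(size=2)           # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    chain[t] = beta

print(chain[2_000:].mean(axis=0))                    # posterior means: intercept, slope
```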

12 pages, 5618 KB  
Article
An Algorithm for the Conditional Distribution of Independent Binomial Random Variables Given the Sum
by Kelly Ayres and Steven E. Rigdon
Mathematics 2025, 13(13), 2155; https://doi.org/10.3390/math13132155 - 30 Jun 2025
Viewed by 648
Abstract
We investigate Metropolis–Hastings (MH) algorithms to approximate the distribution of independent binomial random variables conditioned on their sum. Let $X_i \sim \mathrm{BIN}(n_i, p_i)$. We want the distribution of $[X_1, \ldots, X_k]$ conditioned on $X_1 + \cdots + X_k = n$. We propose both a random-walk MH algorithm and an independence-sampling MH algorithm for simulating from this conditional distribution. The acceptance probability in the MH algorithm always involves the probability mass function of the proposal distribution. For the random-walk MH algorithm, we take this distribution to be uniform across all possible proposals. There is an inherent asymmetry: the number of possible moves from one state to another is not, in general, equal to the number of possible moves back, which requires a careful count of the possible moves out of each state. The independence sampler proposes a move based on the Poisson approximation to the binomial. While random-walk MH algorithms tend to outperform independence samplers in general, we find that in this case the independence sampler is more efficient.
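
The independence sampler is concrete enough to sketch: independent Poisson(n_i p_i) variables conditioned on their sum are multinomial, so the Poisson approximation yields a multinomial proposal whose pmf enters the acceptance ratio against the binomial target. The sizes, probabilities, and conditioning total below are toy values:

```python
import numpy as np
from scipy.stats import binom, multinomial

rng = np.random.default_rng(7)
ns = np.array([10, 15, 20])          # binomial sizes n_i (toy values)
ps = np.array([0.3, 0.5, 0.7])       # binomial success probabilities p_i
total = 25                           # conditioning value of X_1 + ... + X_k

# independent Poisson(n_i * p_i) conditioned on the sum is multinomial:
w = ns * ps / (ns * ps).sum()

def log_target(x):
    # product of binomial pmfs; the fixed-sum constraint only changes the
    # normalizing constant, which cancels in the acceptance ratio
    return -np.inf if np.any(x > ns) else binom.logpmf(x, ns, ps).sum()

x = rng.multinomial(total, w)
while np.isinf(log_target(x)):       # redraw until the start state is feasible
    x = rng.multinomial(total, w)

draws = np.empty((10_000, len(ns)), dtype=int)
for t in range(len(draws)):
    y = rng.multinomial(total, w)    # independence proposal
    log_a = (log_target(y) + multinomial.logpmf(x, total, w)
             - log_target(x) - multinomial.logpmf(y, total, w))
    if np.log(rng.uniform()) < log_a:
        x = y
    draws[t] = x

print(draws.mean(axis=0))            # conditional means of X_1, X_2, X_3
```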

16 pages, 616 KB  
Article
Bayesian Quantile Regression for Partial Functional Linear Spatial Autoregressive Model
by Dengke Xu, Shiqi Ke, Jun Dong and Ruiqin Tian
Axioms 2025, 14(6), 467; https://doi.org/10.3390/axioms14060467 - 16 Jun 2025
Viewed by 524
Abstract
When performing Bayesian modeling on functional data, the model error is often assumed to be normal, so the results may be sensitive to outliers and/or heavy-tailed data. A natural remedy for such problems is quantile regression. This paper therefore introduces quantile regression into the partial functional linear spatial autoregressive model (PFLSAM), modeling the errors with the asymmetric Laplace distribution. Combining functional principal component analysis with a hybrid MCMC algorithm that mixes Gibbs sampling and Metropolis–Hastings steps, we generate posterior samples from the full posterior distributions and obtain Bayesian estimates of the unknown parameters and functional coefficients in the model. Simulation studies show that the proposed Bayesian estimation method is feasible and effective.
