Advances in Statistical Methods with Applications

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Probability and Statistics".

Deadline for manuscript submissions: 30 April 2025 | Viewed by 3886

Special Issue Editors


Prof. Dr. Mohamed T. Madi
Guest Editor
Faculty of Business & Economics, United Arab Emirates University, Al Ain 17555, United Arab Emirates
Interests: statistical theory; decision theory; Bayesian analysis; reliability theory; regression modeling

Prof. Dr. Mohammad Z. Raqab
Guest Editor
Department of Statistics and Operations Research, Kuwait University, Al-Shadadiyya, Kuwait City 12037, Kuwait
Interests: statistical analysis; Bayesian; probability distributions; ordered data; biostatistical applications; reliability studies; intensive computation and simulation

Special Issue Information

Dear Colleagues,

This Special Issue on "Advances in Statistical Methods with Applications" aims to highlight recent advancements in statistical methods and their practical applications across various fields. Recognizing the vital role of statistics in modern research, this Special Issue brings together recent investigations, methodologies, and case studies to explore novel statistical approaches and their implementation.

The articles featured in this Special Issue cover a wide range of topics, including, but not limited to, the following:

  1. Bayesian statistics and its applications;
  2. Machine learning techniques in statistical modeling;
  3. Multivariate analysis and its applications;
  4. Time series analysis and forecasting;
  5. Experimental design and optimization methods;
  6. Spatial statistics and spatial modeling;
  7. Regression analysis and modeling;
  8. Big data analytics and statistical inference;
  9. Decision theory and its applications;
  10. Reliability theory and its applications;
  11. Entropy and information studies.

Prof. Dr. Mohamed T. Madi
Prof. Dr. Mohammad Z. Raqab
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • big data and analysis
  • machine learning
  • mathematical modeling
  • stochastic modeling
  • Monte Carlo simulations
  • measures of divergence and entropy
  • optimization models
  • Bayesian analysis
  • biostatistical applications
  • entropy and information measures

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad-scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (4 papers)


Research

21 pages, 1284 KiB  
Article
Context-Dependent Criteria for Dirichlet Process in Sequential Decision-Making Problems
by Ksenia Kasianova and Mark Kelbert
Mathematics 2024, 12(21), 3321; https://doi.org/10.3390/math12213321 - 23 Oct 2024
Viewed by 539
Abstract
In models with insufficient initial information, parameter estimation can be subject to statistical uncertainty, potentially resulting in suboptimal decision-making; however, delaying implementation to gather more information can also incur costs. This paper examines an extension of information-theoretic approaches designed to address this classical dilemma, focusing on balancing expected profit against the information to be gained about all possible outcomes. Initially utilized in binary outcome scenarios, these methods leverage information measures to harmonize competing objectives efficiently. Building upon the foundations laid by existing research, this methodology is expanded to encompass experiments with multiple outcome categories using Dirichlet processes. The core of our approach is centered on weighted entropy measures, particularly in scenarios governed by Dirichlet distributions, which have not been extensively explored previously. We adapt the technique initially applied to the binary case to Dirichlet distributions and processes. The primary contribution of our work is the formulation of a sequential minimization strategy for the main term of an asymptotic expansion of the differential entropy, which scales with sample size, for non-binary outcomes. This paper provides theoretical grounding, extended empirical applications, and comprehensive proofs, setting a robust framework for further interdisciplinary applications of information-theoretic paradigms in sequential decision-making.
(This article belongs to the Special Issue Advances in Statistical Methods with Applications)
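The differential entropy that drives the sequential criterion above has a closed form for the Dirichlet distribution: h(Dir(α)) = log B(α) + (α₀ − K)ψ(α₀) − Σⱼ(αⱼ − 1)ψ(αⱼ), where α₀ = Σⱼαⱼ and ψ is the digamma function. The sketch below is not the authors' code, only a minimal stdlib-only illustration of this formula, using a standard series approximation for ψ:

```python
import math

def digamma(x):
    """Digamma psi(x) via upward recurrence plus the standard asymptotic series."""
    result = 0.0
    while x < 6.0:                      # shift the argument up to where the series is accurate
        result -= 1.0 / x
        x += 1.0
    inv = 1.0 / x
    inv2 = inv * inv
    # psi(x) ~ ln x - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)
    return result + math.log(x) - 0.5 * inv - inv2 * (1/12 - inv2 * (1/120 - inv2 / 252))

def dirichlet_entropy(alpha):
    """Differential entropy of Dirichlet(alpha):
    h = log B(alpha) + (a0 - K) * psi(a0) - sum_j (a_j - 1) * psi(a_j)."""
    a0 = sum(alpha)
    log_beta = sum(math.lgamma(a) for a in alpha) - math.lgamma(a0)
    return (log_beta
            + (a0 - len(alpha)) * digamma(a0)
            - sum((a - 1.0) * digamma(a) for a in alpha))
```

Dir(1, 1) is uniform on the one-dimensional simplex, so its entropy is 0; a sharper prior such as Dir(2, 2) has strictly lower entropy, and this is the kind of quantity the sequential rule minimizes as observations accumulate.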

17 pages, 1592 KiB  
Article
An Enhanced Tree Ensemble for Classification in the Presence of Extreme Class Imbalance
by Samir K. Safi and Sheema Gul
Mathematics 2024, 12(20), 3243; https://doi.org/10.3390/math12203243 - 16 Oct 2024
Viewed by 879
Abstract
Researchers using machine learning methods for classification can face challenges due to class imbalance, where a certain class is underrepresented. Over- or under-sampling of minority or majority class observations, or relying solely on model selection for ensemble methods, may prove ineffective when the class imbalance ratio is extremely high. To address this issue, this paper proposes a method called the enhanced tree ensemble (ETE), based on generating synthetic data for minority class observations in conjunction with tree selection based on performance on the training data. The proposed method first generates minority class instances to balance the training data and then selects trees by their performance on out-of-bag (ETEOOB) or sub-sample (ETESS) observations. The efficacy of the proposed method is assessed using twenty benchmark problems for binary classification with moderate to extreme class imbalance, comparing it against other well-known methods such as the optimal tree ensemble (OTE), SMOTE random forest (RFSMOTE), oversampling random forest (RFOS), under-sampling random forest (RFUS), k-nearest neighbors (k-NN), support vector machine (SVM), classification tree, and artificial neural network (ANN). Performance metrics such as classification error rate and precision are used for evaluation. The analyses revealed that the proposed method, based on data balancing and model selection, yielded better results than the other methods.
(This article belongs to the Special Issue Advances in Statistical Methods with Applications)
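The data-balancing step described above generates synthetic minority-class points. The authors' exact generator is not reproduced here, but the SMOTE-style interpolation that the paper's baselines (e.g., RFSMOTE) rely on can be sketched with the standard library alone: each synthetic point is placed on the segment between a sampled minority observation and its nearest minority neighbour.

```python
import math
import random

def smote_like(minority, n_new, rng=None):
    """Create n_new synthetic points, each interpolated between a randomly
    chosen minority observation and its nearest minority neighbour."""
    rng = rng or random.Random(0)
    synthetic = []
    for _ in range(n_new):
        i = rng.randrange(len(minority))
        p = minority[i]
        # nearest neighbour among the remaining minority points
        q = min((m for j, m in enumerate(minority) if j != i),
                key=lambda m: math.dist(p, m))
        lam = rng.random()  # position along the segment p -> q
        synthetic.append(tuple(pi + lam * (qi - pi) for pi, qi in zip(p, q)))
    return synthetic
```

Because every synthetic point is a convex combination of two real minority points, the augmented minority class stays inside the region the original observations span.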

21 pages, 314 KiB  
Article
Analyzing Interval-Censored Recurrence Event Data with Adjusting Informative Observation Times by Propensity Scores
by Ni Li, Meiting Lin and Yakun Shang
Mathematics 2024, 12(12), 1887; https://doi.org/10.3390/math12121887 - 18 Jun 2024
Viewed by 724
Abstract
In this paper, we discuss the statistical inference of interval-censored recurrence event data under an informative observation process. We establish an additive semiparametric mean model for the recurrence event process. Since the observation process may contain relevant information about the underlying recurrence event process, which can lead to confounding bias, we introduce a propensity score into the additive semiparametric mean model to adjust for any such bias. Furthermore, estimating equations are used to estimate the covariate-effect parameters, and the asymptotic normality of the estimator under large samples is proven. Simulation studies illustrate that the proposed method works well, and it is applied to the analysis of bladder cancer data.
(This article belongs to the Special Issue Advances in Statistical Methods with Applications)
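A propensity score of the kind used above is typically estimated with a logistic model for the observation indicator given covariates. The following stdlib-only sketch (variable names are illustrative, not from the paper) fits such a model by gradient ascent on the log-likelihood and returns fitted scores in (0, 1):

```python
import math

def fit_logistic(X, z, lr=0.1, iters=2000):
    """Fit P(Z=1 | x) = sigmoid(b0 + b . x) by gradient ascent on the
    log-likelihood; X is a list of covariate vectors, z a list of 0/1 labels."""
    n, d = len(X), len(X[0])
    beta = [0.0] * (d + 1)              # intercept followed by slopes
    for _ in range(iters):
        grad = [0.0] * (d + 1)
        for xi, zi in zip(X, z):
            eta = beta[0] + sum(b * x for b, x in zip(beta[1:], xi))
            p = 1.0 / (1.0 + math.exp(-eta))
            resid = zi - p              # score contribution of this observation
            grad[0] += resid
            for j, x in enumerate(xi):
                grad[j + 1] += resid * x
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta

def propensity(beta, x):
    """Fitted probability of being observed, given covariates x."""
    eta = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return 1.0 / (1.0 + math.exp(-eta))
```

The fitted scores can then enter a mean model as an extra regressor or as inverse weights; which adjustment the paper uses is specified in the article itself.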
13 pages, 290 KiB  
Article
Attribute Sampling Plan for Submitted Lots Based on Prior Information and Bayesian Approach
by Jing Zhao, Fengyun Zhang, Xuan Zhang, Yuping Hu and Wenxing Ding
Mathematics 2024, 12(11), 1692; https://doi.org/10.3390/math12111692 - 29 May 2024
Viewed by 855
Abstract
An acceptance sampling plan is a method used to decide on acceptance or rejection of a product based on adherence to a standard. Meanwhile, prior information, such as the process capability index (PCI), has been applied in different manufacturing industries to improve the quality of manufacturing processes and the quality inspection of products. In this paper, an attribute sampling plan is developed for submitted lots based on prior information and a Bayesian approach. The new attribute sampling plans adjust sample sizes to prior information based on the status of the inspection target. Specifically, the sampling plans are indexed by a trust parameter with levels of low, medium, and high, where a higher trust level reduces the sample size or risk. PCIs are an important basis for choosing the trust level. In addition, multiple comparisons are performed, including the producer's risk and consumer's risk under different prior-information parameters and sample sizes.
(This article belongs to the Special Issue Advances in Statistical Methods with Applications)
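For a single attribute plan (inspect n items, accept the lot when at most c defectives are found), both the classical acceptance probability and its Bayesian average under a Beta prior on the defect rate have closed forms. The sketch below is illustrative, not the authors' plan-design code; the Bayesian version is simply the beta-binomial predictive probability of acceptance:

```python
import math

def accept_prob(n, c, p):
    """Probability a lot passes the plan (n, c) when the true defect rate is p."""
    return sum(math.comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

def bayes_accept_prob(n, c, a, b):
    """Prior-averaged acceptance probability with a Beta(a, b) prior on p,
    i.e. the beta-binomial predictive mass on {0, ..., c} defectives."""
    def beta_fn(x, y):
        return math.exp(math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y))
    return sum(math.comb(n, d) * beta_fn(d + a, n - d + b) / beta_fn(a, b)
               for d in range(c + 1))
```

With the flat Beta(1, 1) prior the predictive number of defectives is uniform on {0, …, n}, so the averaged acceptance probability is (c + 1)/(n + 1); a more informative prior (higher trust) shifts these probabilities and is what lets a smaller sample size meet the same producer's and consumer's risks.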
