Advances in High-Dimensional Data Analysis

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Probability and Statistics".

Deadline for manuscript submissions: 31 December 2024 | Viewed by 1624

Special Issue Editors

Guest Editor
School of Mathematics & Statistics, Central South University, Changsha 410083, China
Interests: statistical learning; survival analysis; data mining

Guest Editor
Department of Statistics, Business School, Qingdao University of Technology, Qingdao 266520, China
Interests: machine learning; spatial statistics; high-dimensional and complex data analysis; deep learning; kriging methods; manifold learning

Guest Editor
School of Mathematics & Statistics, Hunan Normal University, Changsha 410081, China
Interests: high-dimensional analysis; machine learning; deep learning; survival analysis

Special Issue Information

Dear Colleagues,

High-dimensional data analysis has been an important focus of theoretical and applied statistics research for more than three decades, with application areas including biostatistics, bioinformatics, chemistry, ecology, economics, and the social sciences. The aim of this Special Issue is to collect research papers that apply statistical principles (methodological, theoretical, or computational) to high-dimensional data analysis, as well as scalable optimization methods and applications in important real-world fields.

For this Special Issue, we encourage original research submissions that provide new results in high-dimensional statistical inference and its applications. Review papers on all aspects of high-dimensional data analysis are also welcome.

Dr. Hong Wang
Dr. Liang Shen
Dr. Xuewei Cheng
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • high-dimensional inference
  • feature screening
  • variable selection
  • dimension reduction
  • high-dimensional statistical learning
  • machine learning for high-dimensional data
  • various applications of high-dimensional analysis approaches

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

21 pages, 1058 KiB  
Article
Large Sample Behavior of the Least Trimmed Squares Estimator
by Yijun Zuo
Mathematics 2024, 12(22), 3586; https://doi.org/10.3390/math12223586 - 15 Nov 2024
Viewed by 310
Abstract
The least trimmed squares (LTS) estimator is popular in the location, regression, machine learning, and AI literature. Although the empirical version of the LTS has been studied repeatedly in the literature, its population version has never been introduced and studied. The lack of a population version hinders the study of the large-sample properties of the LTS via empirical process theory. Novel properties of the LTS objective function, in both the empirical and population settings, along with other properties, are established for the first time in this article. These primary properties of the objective function facilitate further original results, including the influence function and Fisher consistency. Strong consistency is established, for the first time, with the help of a generalized Glivenko–Cantelli theorem over a class of functions. Differentiability and stochastic equicontinuity promote the establishment of asymptotic normality with a concise and novel approach.
(This article belongs to the Special Issue Advances in High-Dimensional Data Analysis)
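
For orientation, the snippet below is a minimal sketch of the standard empirical LTS objective (the sum of the h smallest squared residuals) together with a crude random-start search in the spirit of FAST-LTS. It illustrates the estimator in its generic textbook form only; it is not the theoretical framework developed in the article, and the function names and parameters (lts_objective, lts_fit, n_starts, n_csteps) are purely illustrative.

```python
import numpy as np

def lts_objective(beta, X, y, h):
    """Sum of the h smallest squared residuals of the fit y ~ X @ beta."""
    residuals_sq = (y - X @ beta) ** 2
    return np.sort(residuals_sq)[:h].sum()

def lts_fit(X, y, h, n_starts=200, n_csteps=10, seed=0):
    """Crude random-start search with concentration (C-) steps."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        subset = rng.choice(n, size=p, replace=False)        # elemental start
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        for _ in range(n_csteps):                            # refit on the h best-fitting points
            keep = np.argsort((y - X @ beta) ** 2)[:h]
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        obj = lts_objective(beta, X, y, h)
        if obj < best_obj:
            best_beta, best_obj = beta, obj
    return best_beta, best_obj

# Example: 200 observations, 3 coefficients, 20% of responses grossly contaminated.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)
y[:40] += 10.0
beta_hat, _ = lts_fit(X, y, h=150)
print(beta_hat.round(2))
```

Because only the h best-fitting observations enter the objective, the fitted coefficients remain close to the true values despite the outliers, which is the robustness property the article analyzes at the population level.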

16 pages, 3858 KiB  
Article
Element Aggregation for Estimation of High-Dimensional Covariance Matrices
by Jingying Yang
Mathematics 2024, 12(7), 1045; https://doi.org/10.3390/math12071045 - 30 Mar 2024
Viewed by 926
Abstract
This study addresses the challenge of estimating high-dimensional covariance matrices in financial markets, where traditional sparsity assumptions often fail due to the interdependence of stock returns across sectors. We present an innovative element-aggregation method that aggregates matrix entries to estimate covariance matrices. The method is designed to be applicable to both sparse and non-sparse matrices, transcending the limitations of sparsity-based approaches. Its computational simplicity ensures low complexity, making it a practical tool for real-world applications. Theoretical analysis confirms the method's consistency and effectiveness, establishing its convergence rate in specific scenarios. Numerical experiments validate the method's superior algorithmic performance compared to conventional methods, as well as its reduction in relative estimation errors. Furthermore, empirical studies in financial portfolio optimization demonstrate the method's significant risk-management benefits, particularly its ability to effectively mitigate portfolio risk even with limited sample sizes.
(This article belongs to the Special Issue Advances in High-Dimensional Data Analysis)
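
The abstract leaves the details of the proposed estimator to the article itself. Purely as a loose illustration of the general idea of aggregating covariance-matrix entries, the sketch below averages the off-diagonal entries of a sample covariance matrix within known groups (e.g., market sectors). The grouping rule, the assumption that group labels are available, and the function name block_aggregated_cov are illustrative assumptions and not the estimator proposed in the paper.

```python
import numpy as np

def block_aggregated_cov(X, groups):
    """Sample covariance whose off-diagonal entries are averaged within each
    pair of groups; the diagonal (variances) is left untouched.
    NOTE: a generic illustration of entry aggregation, not the paper's method."""
    S = np.cov(X, rowvar=False)                  # p x p sample covariance
    groups = np.asarray(groups)
    S_hat = S.copy()
    for g in np.unique(groups):
        for k in np.unique(groups):
            mask = np.outer(groups == g, groups == k)
            if g == k:
                mask = mask & ~np.eye(len(groups), dtype=bool)   # keep variances as-is
            if mask.any():
                S_hat[mask] = S[mask].mean()
    return S_hat

# Example: 60 observations of 8 assets split into two sectors of four assets each.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 8))
print(block_aggregated_cov(X, groups=[0, 0, 0, 0, 1, 1, 1, 1]).round(3))
```

Averaging many noisy entries reduces the variance of each aggregated estimate, which is why such schemes can remain stable when the dimension is large relative to the sample size.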
