Large Sample Behavior of the Least Trimmed Squares Estimator
Abstract
1. Introduction
- (a) Introducing a novel partition of the parameter space and defining, for the first time, an original population version of the LTS;
- (b) Investigating primary properties of the sample and population versions of the LTS objective function, obtaining original results;
- (c) Obtaining, for the first time, the influence function and Fisher consistency of the LTS;
- (d) Establishing, for the first time, the strong consistency of the sample LTS via a generalized Glivenko-Cantelli theorem, without artificial assumptions; and
- (e) Employing, for the first time, a novel and concise approach based on empirical process theory to establish the asymptotic normality of the sample LTS.
2. Definition and Properties of the LTS
2.1. Definition
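The section defines the LTS estimator, which minimizes the sum of the h smallest squared residuals. As a minimal numerical sketch (not the paper's procedure), the criterion and a random-start concentration search can be written as follows; the start/step counts and the `lstsq`-based refits are illustrative assumptions:

```python
import numpy as np

def lts_objective(beta, X, y, h):
    """LTS criterion: the sum of the h smallest squared residuals."""
    r2 = np.sort((y - X @ beta) ** 2)
    return r2[:h].sum()

def c_step(beta, X, y, h):
    """One concentration step: refit OLS on the h cases with the
    smallest squared residuals under the current fit."""
    idx = np.argsort((y - X @ beta) ** 2)[:h]
    beta_new, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return beta_new

def lts_fit(X, y, h, n_starts=50, n_steps=20, seed=None):
    """Random elemental starts followed by concentration steps;
    returns the candidate with the smallest LTS objective."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        for _ in range(n_steps):
            beta = c_step(beta, X, y, h)
        obj = lts_objective(beta, X, y, h)
        if obj < best_obj:
            best_beta, best_obj = beta, obj
    return best_beta
```

A concentration step can never increase the objective, which is why short chains of steps from many random starts work well in practice; this is the idea behind the FAST-LTS algorithm of Rousseeuw and Van Driessen.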
2.2. Properties in the Empirical Case
- Existence and uniqueness
- Partition of the parameter space
- (i) (a) For any l (), over . (b) For any , there exists an open ball centered at η with a radius such that for any .
- (ii) The graph of over is composed of the L closures of the graphs of the quadratic function of β, for and any l (), joined together.
- (iii) is continuous in .
- (iv) is differentiable and strictly convex over each for any .
- (i) exists and is the local minimum of over for some ().
- (ii) Over , is the solution of the system of equations .
- (iii) Over , the unique solution is .
2.3. Properties in the Population Case
2.3.1. Definition of Influence Function
2.3.2. Existence and Uniqueness
2.3.3. Fisher Consistency
- (i) is invertible, and
- (ii) , where .
2.3.4. Influence Function
3. Asymptotic Properties
3.1. Strong Consistency
3.2. Root-n Consistency and Asymptotic Normality
- (i) is an interior point of the parameter set T;
- (ii) has a non-singular second derivative matrix V at ;
- (iii) ;
- (iv) The components of all belong to ;
- (v) The sequence is stochastically equicontinuous at ;
- (i) The uniqueness assumptions for and in Theorems 1 and 2 hold, respectively;
- (ii) exists with ;
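The root-n rate and limiting normal law can be illustrated numerically in the simplest setting, the LTS location problem. In sorted order, the h-point subset minimizing the sum of squared deviations from its own mean is contiguous, so a sliding window suffices; the sample sizes, h = n//2 + 1, and the N(0, 1) model below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def lts_location(x, h):
    """LTS location estimate: mean of the contiguous sorted h-window
    with the smallest within-window sum of squared deviations."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = xs.size
    cs = np.concatenate(([0.0], np.cumsum(xs)))
    cs2 = np.concatenate(([0.0], np.cumsum(xs ** 2)))
    best_mu, best_obj = xs[:h].mean(), np.inf
    for i in range(n - h + 1):
        s = cs[i + h] - cs[i]
        s2 = cs2[i + h] - cs2[i]
        mu = s / h
        obj = s2 - h * mu ** 2  # sum of squared deviations in the window
        if obj < best_obj:
            best_mu, best_obj = mu, obj
    return best_mu

def mc_sqrtn_draws(n, reps, seed=0):
    """Monte Carlo draws of sqrt(n) * (estimate - 0) for N(0, 1) data."""
    rng = np.random.default_rng(seed)
    h = n // 2 + 1
    return np.array([np.sqrt(n) * lts_location(rng.normal(size=n), h)
                     for _ in range(reps)])
```

The draws should be roughly centered at zero with a variance that stabilizes as n grows, consistent with a sqrt(n)-rate normal limit; heavy (50%) trimming makes that limiting variance much larger than the sample mean's.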
4. Inference Procedures
4.1. Equivariance
4.2. Transformation
- (1) and , with , where and is the CDF of , a chi-square random variable with one degree of freedom.
- (2) with .
- (3) , where C and are defined in (1) and (2) above.
4.3. Approximate Confidence Region
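Section 4.2 involves a chi-square quantile with one degree of freedom, which in one dimension is just a squared standard-normal quantile. A generic sketch of the resulting normal-approximation interval follows; the asymptotic variance `avar` stands in for the paper's constants (V, C), which are assumptions here rather than the article's exact expressions:

```python
import math
from statistics import NormalDist

def approx_ci(est, avar, n, level=0.95):
    """Two-sided interval from the asymptotic approximation
    sqrt(n) * (est - theta) ~ N(0, avar).
    Since chi2_1(q) equals the squared standard-normal quantile z(q)^2,
    the chi-square(1) cutoff is applied here via z(q)."""
    z = NormalDist().inv_cdf(0.5 + level / 2.0)
    half = z * math.sqrt(avar / n)
    return est - half, est + half
```

For example, with `est = 0`, `avar = 1`, and `n = 100`, the 95% interval is approximately (-0.196, 0.196).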
5. Concluding Remarks
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Proofs
© 2024 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zuo, Y. Large Sample Behavior of the Least Trimmed Squares Estimator. Mathematics 2024, 12, 3586. https://doi.org/10.3390/math12223586