**Preface to "Robust Procedures for Estimating and Testing in the Framework of Divergence Measures"**

Estimation and testing based on suitable divergence measures has become, over the last 30 years, a very popular technique not only in statistics but also in other areas, such as machine learning and pattern recognition. For the estimation problem, one minimizes a suitable divergence measure between the data and the model under consideration. Interesting examples of such estimators are the minimum phi-divergence estimators (MPHIE), in particular the minimum Hellinger distance (MHD) estimators, and the minimum density power divergence estimators (MDPDE). The MPHIE are characterized by asymptotic efficiency (they are BAN, best asymptotically normal, estimators); the MHD estimators by asymptotic efficiency and robustness within the family of the MPHIE; and the MDPDE by their robustness without a significant loss of efficiency, as well as by their computational simplicity, since obtaining them does not require a non-parametric estimator of the true density function.
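As a minimal illustration of that last point, the sketch below estimates the mean of a contaminated N(mu, 1) sample by minimizing the density power divergence of Basu et al. directly over the parameter, with no density-estimation step. The tuning constant alpha = 0.5, the contamination scenario, and all function names are illustrative assumptions, not material from this volume.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def dpd_objective(mu, x, alpha):
    # Density power divergence objective for the N(mu, 1) model:
    #   H_n(mu) = int f_mu^{1+alpha} dx - (1 + 1/alpha) * mean(f_mu(x_i)^alpha)
    # For sigma = 1 the integral has the closed form (2*pi)^(-alpha/2) / sqrt(1 + alpha),
    # so no non-parametric density estimate is needed.
    integral = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    f = norm.pdf(x, loc=mu, scale=1.0)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpde(x, alpha=0.5):
    # Restrict the search to a window around the bulk of the data (median +/- 3)
    # so the bounded scalar minimizer stays in the basin of the main mode.
    med = np.median(x)
    res = minimize_scalar(dpd_objective, bounds=(med - 3, med + 3),
                          args=(x, alpha), method="bounded")
    return res.x

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 95)
outliers = np.full(5, 10.0)          # 5% gross outliers at 10
x = np.concatenate([clean, outliers])

print(np.mean(x))      # sample mean is dragged toward the outliers
print(mdpde(x, 0.5))   # MDPDE stays near the clean-data mean
```

As alpha tends to 0 the objective approaches the negative log-likelihood, so the MDPDE interpolates between the maximum likelihood estimator (efficient but non-robust) and increasingly heavy downweighting of outliers as alpha grows.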

Based on these minimum divergence (or distance) estimators, many authors have studied the possibility of using them to construct test statistics. There are two main ways to do so: (i) plugging them into a divergence measure in order to obtain the estimated divergence between the model whose parameters have been estimated under the null hypothesis and the model evaluated over the whole parameter space; and (ii) extending the classical Wald test by using MDPDE instead of maximum likelihood estimators. These test statistics have been considered in many different statistical problems: censoring, equality of means in normal and lognormal models, logistic regression, multinomial logistic regression, and generalized linear models (GLMs) in general, among others.
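Possibility (i) can be sketched in its simplest setting, a multinomial goodness-of-fit test against a fully specified null, where no parameter estimation is needed: plugging the observed proportions into a phi-divergence against the null probabilities yields a statistic that is asymptotically chi-squared with k - 1 degrees of freedom. The die-roll counts and function names below are made-up illustrations, not data from this volume.

```python
import numpy as np
from scipy.stats import chi2

def phi_kl(t):
    # Kullback-Leibler generator: phi(t) = t*log(t) - t + 1, with phi''(1) = 1.
    t = np.asarray(t, dtype=float)
    out = np.ones_like(t)            # phi(0) = lim_{t->0} phi(t) = 1
    pos = t > 0
    out[pos] = t[pos] * np.log(t[pos]) - t[pos] + 1
    return out

def phi_pearson(t):
    # Pearson generator: phi(t) = (t - 1)^2 / 2, with phi''(1) = 1.
    return (np.asarray(t, dtype=float) - 1) ** 2 / 2

def phi_divergence_stat(counts, p0, phi=phi_kl, phi_dd1=1.0):
    # T = (2n / phi''(1)) * sum_j p0_j * phi(phat_j / p0_j),
    # asymptotically chi2(k - 1) under a fully specified null.
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    phat = counts / n
    return (2 * n / phi_dd1) * np.sum(p0 * phi(phat / p0))

counts = np.array([18, 22, 17, 25, 21, 17])   # hypothetical die rolls, n = 120
p0 = np.full(6, 1 / 6)                        # null: fair die

T = phi_divergence_stat(counts, p0)           # KL generator -> likelihood-ratio G^2
pval = chi2.sf(T, df=len(counts) - 1)
X2 = phi_divergence_stat(counts, p0, phi=phi_pearson)  # Pearson's chi-square X^2
```

Different choices of the generator phi recover familiar statistics from the same formula: the KL generator gives the likelihood-ratio statistic G^2, while (t - 1)^2 / 2 gives Pearson's X^2.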

The contributions to this book present new and original research on MPHIE, MHD estimators, and MDPDE, as well as on test statistics based on these estimators, from both a theoretical and an applied point of view in different statistical problems, with special emphasis on robustness. Manuscripts giving solutions to different statistical problems, such as model selection criteria based on divergence measures or statistics for high-dimensional data with divergence measures as loss functions, are considered. Reviews emphasizing the most recent state of the art regarding the solution of statistical problems based on divergence measures are also presented.

> **Leandro Pardo, Nirian Martin** *Editors*
