Minimum Penalized Hellinger Distance for Model Selection in Small Samples

PP. 369-382    DOI: 10.4236/ojs.2012.24045

ABSTRACT

In statistical modeling, the Akaike information criterion (AIC) is a widely known and extensively used tool for model choice. The φ-divergence test statistic is a more recently developed tool for statistical model selection. The popularity of divergence-based criteria is, however, tempered by their known lack of robustness in small samples. In this paper, penalized minimum Hellinger distance type statistics are considered and some of their properties are established. The limit laws of the estimates and test statistics are given under both the null and the alternative hypotheses, and approximations of the power functions are deduced. A model selection criterion based on these divergence measures is developed for parametric inference. Our interest is in the problem of testing for the choice between two models using informational-type statistics, when independent samples are drawn from a discrete population. We discuss the asymptotic properties of the new test procedures and investigate their small-sample behavior.
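As a minimal illustrative sketch of the kind of procedure the abstract describes, the Python snippet below uses one common form of penalized Hellinger distance for discrete data, in which non-empty cells contribute the usual squared-root difference and empty cells are weighted by a penalty factor h (h = 1 recovers the ordinary squared Hellinger distance). The Poisson and geometric candidate models, the function names, the truncated support, and the choice h = 0.5 are assumptions made here for illustration; they are not the paper's exact penalty or selection procedure.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson, geom

def penalized_hellinger(counts, model_probs, h=1.0):
    """Penalized squared Hellinger-type distance between empirical cell
    frequencies and model cell probabilities.  Non-empty cells contribute
    (sqrt(d_n) - sqrt(m_theta))^2; empty cells are penalized by h * m_theta.
    h = 1 gives back the ordinary squared Hellinger distance (illustrative form)."""
    d_n = counts / counts.sum()
    nonempty = d_n > 0
    dist = np.sum((np.sqrt(d_n[nonempty]) - np.sqrt(model_probs[nonempty])) ** 2)
    return dist + h * np.sum(model_probs[~nonempty])

def min_phd(sample, support, pmf, bounds, h=1.0):
    """Minimize the penalized distance over a one-dimensional parameter;
    returns (parameter estimate, minimized distance)."""
    counts = np.array([np.sum(sample == x) for x in support], dtype=float)
    res = minimize_scalar(lambda t: penalized_hellinger(counts, pmf(support, t), h=h),
                          bounds=bounds, method="bounded")
    return res.x, res.fun

# Small-sample illustration (n = 30): choose between a Poisson and a
# geometric model by comparing their minimized penalized distances.
rng = np.random.default_rng(0)
sample = rng.poisson(3.0, size=30)
support = np.arange(0, sample.max() + 5)

lam_hat, d_pois = min_phd(sample, support, poisson.pmf, bounds=(1e-3, 50.0), h=0.5)
p_hat, d_geom = min_phd(sample, support, lambda k, p: geom.pmf(k, p, loc=-1),
                        bounds=(1e-3, 1 - 1e-3), h=0.5)

print(f"Poisson:   lambda_hat = {lam_hat:.3f}, penalized distance = {d_pois:.4f}")
print(f"Geometric: p_hat = {p_hat:.3f}, penalized distance = {d_geom:.4f}")
print("selected model:", "Poisson" if d_pois < d_geom else "Geometric")
```

In this sketch the model with the smaller minimized penalized distance is selected, in the spirit of an information-criterion comparison; the paper's actual test statistics and their limit laws should be consulted for the formal procedure.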

Share and Cite:

P. Ngom and B. Ntep, "Minimum Penalized Hellinger Distance for Model Selection in Small Samples," Open Journal of Statistics, Vol. 2 No. 4, 2012, pp. 369-382. doi: 10.4236/ojs.2012.24045.

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.