
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
by Leandro Pardo

Price
€ 61,49

Shipped from a remote warehouse

Estimated delivery: 1–12 January 2026

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems, with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald statistics, likelihood ratio statistics, and Rao score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, based on minimum divergence estimators, for testing simple and composite null hypotheses in general parametric models.
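As a rough illustration of the kind of procedure described above (a minimal sketch under assumed notation, not the book's own): suppose $\hat{\theta}_\beta$ is a minimum density power divergence estimator with tuning parameter $\beta$ and asymptotic covariance matrix $J_\beta^{-1}(\theta)\,K_\beta(\theta)\,J_\beta^{-1}(\theta)$. A Wald-type statistic for the simple null hypothesis $H_0: \theta = \theta_0$ could then take the form

% Illustrative Wald-type statistic built on a minimum divergence estimator;
% J_beta and K_beta denote the matrices in the estimator's asymptotic covariance (assumed notation).
\[
W_n(\theta_0) \;=\; n\,\bigl(\hat{\theta}_\beta - \theta_0\bigr)^{\top}
\left[ J_\beta^{-1}(\theta_0)\, K_\beta(\theta_0)\, J_\beta^{-1}(\theta_0) \right]^{-1}
\bigl(\hat{\theta}_\beta - \theta_0\bigr)
\;\xrightarrow{\;d\;}\; \chi^2_p
\quad \text{under } H_0 .
\]

For $\beta = 0$ such an estimator reduces to the maximum likelihood estimator and the statistic to the classical Wald test; $\beta > 0$ trades a small loss of efficiency for robustness against outlying observations.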

Media: Books · Paperback (softcover, glued spine)
Published: 20 May 2019
ISBN-13: 9783038979364
Publisher: Mdpi AG
Pages: 344
Dimensions: 170 × 244 × 24 mm · 734 g
Language: English
