Some applications of Kullback-Leibler and Jeffreys’ divergences in multinomial populations

Date

2012

Publisher

Selçuk Üniversitesi

Access Rights

info:eu-repo/semantics/openAccess

Abstract

Among the entropy measures that have been proposed are Shannon entropy (1948), Rényi entropy (1961), Havrda-Charvát entropy (1967), and Tsallis entropy (1988). The limit of Rényi divergence is relative entropy (or Kullback-Leibler divergence), which measures the discrepancy between two statistical hypotheses or two probability distributions. Jeffreys' divergence measures the difficulty of discriminating between two probability distributions. These divergence measures are asymptotically related to chi-square distributions, so they can be used in hypothesis tests. In this study, I show through examples that entropy-based statistics such as the Kullback-Leibler divergence and Jeffreys' divergence can be used in statistical hypothesis tests for multinomial populations.
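To illustrate the asymptotic chi-square connection the abstract describes, here is a minimal Python sketch (not taken from the paper): under H0, the statistic 2n*D_KL(p_hat || p0) (the G-statistic) and the Jeffreys statistic n*J(p_hat, p0) are both asymptotically chi-square with k - 1 degrees of freedom for a multinomial sample with k cells. The function names and the die-rolling data are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # cells with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys_divergence(p, q):
    """Jeffreys' divergence J(p, q) = D(p || q) + D(q || p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def divergence_gof_tests(counts, p0):
    """Goodness-of-fit tests for a multinomial sample.

    Under H0: p = p0, both 2*n*D(p_hat || p0) (the G-statistic) and
    n*J(p_hat, p0) are asymptotically chi-square with k - 1 df.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_hat = counts / n
    df = len(counts) - 1
    g_stat = 2.0 * n * kl_divergence(p_hat, p0)
    j_stat = n * jeffreys_divergence(p_hat, p0)  # needs all counts > 0
    return (g_stat, chi2.sf(g_stat, df)), (j_stat, chi2.sf(j_stat, df))

# Hypothetical data: 120 rolls of a die, testing H0: the die is fair.
counts = [25, 17, 18, 22, 16, 22]
p0 = [1 / 6] * 6
(g, g_p), (j, j_p) = divergence_gof_tests(counts, p0)
print(f"KL (G-test): stat = {g:.3f}, p-value = {g_p:.3f}")
print(f"Jeffreys:    stat = {j:.3f}, p-value = {j_p:.3f}")
```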

Keywords

Goodness of fit, Jeffreys’ divergence, Kullback-Leibler divergence, Shannon entropy

Source

Journal of Selcuk University Natural and Applied Science

Volume

1

Issue

4

Citation

Evren, A. A. (2012). Some applications of Kullback-Leibler and Jeffreys' divergences in multinomial populations. Journal of Selcuk University Natural and Applied Science, 1(4), 48-58.