Author: Evren, Atif Ahmet
Date accessioned: 2020-12-21
Date available: 2020-12-21
Date issued: 2012
Citation: Evren, A. A. (2012). Some applications of Kullback-Leibler and Jeffreys’ divergences in multinomial populations. Journal of Selcuk University Natural and Applied Science, 1(4), 48-58.
ISSN: 2147-3781
URI: https://hdl.handle.net/20.500.12395/40863
Abstract: Among the entropy measures that have been proposed are Shannon entropy (1948), Rényi entropy (1961), Havrda-Charvát entropy (1967), and Tsallis entropy (1988). As its order tends to 1, Rényi divergence converges to relative entropy (Kullback-Leibler divergence), a measure of the discrepancy between two probability distributions or two statistical hypotheses. Jeffreys’ divergence measures the difficulty of discriminating between two probability distributions. Asymptotically, these divergence measures are related to chi-square distributions, so they can serve as test statistics in hypothesis tests. In this study I show, through examples, that entropy-based statistics such as Kullback-Leibler divergence and Jeffreys’ divergence can be used in statistical hypothesis tests for multinomial populations.
Language: en
Rights: info:eu-repo/semantics/openAccess
Keywords: Goodness of fit; Jeffreys’ divergence; Kullback-Leibler divergence; Shannon entropy
Title: Some applications of Kullback-Leibler and Jeffreys’ divergences in multinomial populations
Type: Article
Volume: 1; Issue: 4; Pages: 48-58
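The test the abstract describes rests on a standard result: for a multinomial sample of size n with observed proportions p̂ and hypothesized proportions p0 over k categories, the statistic 2n·D(p̂ ∥ p0) equals the likelihood-ratio (G) statistic and is asymptotically chi-square with k-1 degrees of freedom. A minimal Python sketch of such a test follows; it is not code from the paper, the function names and die-roll counts are illustrative, and NumPy and SciPy are assumed to be available:

```python
import numpy as np
from scipy.stats import chi2

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.

    Assumes q > 0 wherever p > 0; terms with p = 0 contribute 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys_divergence(p, q):
    """Jeffreys' divergence J(p, q) = D(p || q) + D(q || p) (symmetrized KL)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def kl_goodness_of_fit(counts, p0):
    """Goodness-of-fit test for a multinomial sample.

    2n * D(p_hat || p0) equals the likelihood-ratio (G) statistic and is
    asymptotically chi-square with k - 1 degrees of freedom."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_hat = counts / n
    stat = 2.0 * n * kl_divergence(p_hat, p0)
    df = len(counts) - 1
    return stat, df, chi2.sf(stat, df)

# Illustrative data: is a six-sided die fair?
counts = [18, 22, 16, 25, 20, 19]   # hypothetical observed roll counts, n = 120
p0 = np.full(6, 1 / 6)              # null hypothesis: uniform multinomial
stat, df, p_value = kl_goodness_of_fit(counts, p0)
print(f"G = {stat:.3f}, df = {df}, p-value = {p_value:.4f}")
```

The jeffreys_divergence helper is included because the abstract also treats discrimination between two distributions; its chi-square calibration for comparing two observed multinomial samples follows analogous asymptotic arguments, for which the paper itself should be consulted.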