Yazar "Evren, Atif Ahmet" seçeneğine göre listele
Now showing 1 - 3 of 3
Item
Do some relative entropy measures coincide in determining correlations or associations for metric data? (Selçuk Üniversitesi, 2015) Evren, Atif Ahmet; Dincer, Gokhan
Entropy is a measure of the uncertainty of a statistical experiment, or of the information provided by experimentation. Several measures of entropy are used in uncertainty considerations for nominal, ordinal, and metric data, and specifically in qualitative variation calculations. In addition, relative entropy concepts (e.g., mutual information) are used in goodness-of-fit tests and, more generally, in checking the adequacy of a statistical model. In particular, relative entropy measures are used in correlation or association estimation. In this study, based on a specific definition of mutual information, we use several different relative entropy measures and then compare these measures under three different situations through applications.

Item
Some applications of Kullback-Leibler and Jeffreys' divergences in multinomial populations (Selçuk Üniversitesi, 2012) Evren, Atif Ahmet
Some of the entropy measures that have been proposed are Shannon entropy (1948), Rényi entropy (1961), Havrda-Charvát entropy (1967), and Tsallis entropy (1988). The limit of Rényi divergence is relative entropy (or Kullback-Leibler divergence), which is a measure of the discrepancy between two statistical hypotheses or two probability distributions. Jeffreys' divergence is a measure of the difficulty of discriminating between two probability distributions. These divergence measures are asymptotically related to chi-square distributions, so they can be used in hypothesis tests. In this study, I show through examples that entropy-based statistics such as Kullback-Leibler divergence and Jeffreys' divergence can be used in statistical hypothesis tests for multinomial populations.

Item
Some Applications of Entropy-Based Statistics in Linear Regression Analysis (Selçuk Üniversitesi, 2012) Evren, Atif Ahmet; Tuna, Elif
Statistical entropy is a measure of the variation of a distribution, especially when the random variable is qualitative. Entropy-based statistics are also used to measure the degree of association between qualitative variables. Two measures of divergence, namely Kullback-Leibler divergence and Jeffreys' divergence, are closely related to the log-likelihood function, so these two entropy-based measures can be used in hypothesis testing procedures as well. In this study, we discuss how relative entropy measures are applied in testing some hypotheses and how useful they can be in regression analysis, especially in determining influential observations.
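The first item above concerns relative entropy measures for correlations or associations. As a minimal sketch of the general idea only (not the paper's specific measures), the following Python snippet computes the mutual information of a two-way contingency table and a normalized, entropy-based association coefficient; the table counts are illustrative assumptions.

```python
# Sketch of an entropy-based association measure for a contingency table.
# The counts and the choice of normalization are illustrative assumptions.
import numpy as np

def entropy(p):
    """Shannon entropy in nats of a probability vector (zeros are ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def mutual_information(table):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a contingency table of counts."""
    joint = table / table.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

table = np.array([[30, 10],
                  [ 5, 55]], dtype=float)   # illustrative 2x2 counts
mi = mutual_information(table)
hx = entropy(table.sum(axis=1) / table.sum())
hy = entropy(table.sum(axis=0) / table.sum())
# Symmetric uncertainty: mutual information scaled to lie between 0 and 1.
print(f"I(X;Y) = {mi:.4f} nats, symmetric uncertainty = {2 * mi / (hx + hy):.4f}")
```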
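For the second item, here is a minimal sketch of how Kullback-Leibler and Jeffreys' divergences can enter a multinomial goodness-of-fit test. It relies on the standard asymptotic result that 2n times the Kullback-Leibler divergence between observed and hypothesized proportions is the likelihood-ratio (G-squared) statistic, approximately chi-square with k-1 degrees of freedom; the counts are illustrative assumptions and this is not necessarily the paper's exact test.

```python
# Sketch: KL and Jeffreys' divergences for a multinomial goodness-of-fit test.
# Counts and hypothesized probabilities are illustrative assumptions.
import numpy as np
from scipy.stats import chi2

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0 * log 0 is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys_divergence(p, q):
    """Symmetrized divergence J(p, q) = D_KL(p||q) + D_KL(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

counts = np.array([30, 25, 20, 25])   # observed multinomial counts
n = counts.sum()
p_hat = counts / n                    # observed relative frequencies
p_0 = np.full(4, 0.25)                # hypothesized (uniform) probabilities

G2 = 2 * n * kl_divergence(p_hat, p_0)   # likelihood-ratio statistic
p_value = chi2.sf(G2, df=len(p_0) - 1)   # asymptotic chi-square with k-1 df
print(f"D_KL = {kl_divergence(p_hat, p_0):.4f}, J = {jeffreys_divergence(p_hat, p_0):.4f}")
print(f"G^2 = {G2:.3f}, p-value = {p_value:.3f}")
```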
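For the third item, the connection to the log-likelihood function can be illustrated with one well-known log-likelihood-based influence diagnostic for linear regression, the likelihood displacement. This is a hedged sketch under assumed data; the choice of diagnostic is illustrative and not necessarily the paper's exact procedure.

```python
# Sketch: likelihood displacement LD_i = 2*[l(theta_hat) - l(theta_hat_(i))]
# as a log-likelihood-based influence diagnostic for ordinary least squares.
# The simulated data and the planted outlier are illustrative assumptions.
import numpy as np

def gaussian_loglik(y, X, beta, sigma2):
    """Gaussian log-likelihood of the full sample at given estimates."""
    resid = y - X @ beta
    n = len(y)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * np.sum(resid**2) / sigma2

def likelihood_displacement(y, X):
    n = len(y)
    beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2_full = np.sum((y - X @ beta_full) ** 2) / n      # ML variance estimate
    l_full = gaussian_loglik(y, X, beta_full, sigma2_full)
    ld = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta_i, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        sigma2_i = np.sum((y[keep] - X[keep] @ beta_i) ** 2) / (n - 1)
        # Evaluate the full-data log-likelihood at the case-deleted estimates.
        ld[i] = 2 * (l_full - gaussian_loglik(y, X, beta_i, sigma2_i))
    return ld

rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=30)
y[0] += 5.0                                    # plant one influential observation
X = np.column_stack([np.ones_like(x), x])
print(np.round(likelihood_displacement(y, X), 2))   # large value flags case 0
```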