Yazar "Evren, Atif Ahmet" seçeneğine göre listele

Now showing 1 - 3 of 3
  • Item
    Do some relative entropy measures coincide in determining correlations or associations for metric data?
    (Selçuk Üniversitesi, 2015) Evren, Atif Ahmet; Dincer, Gokhan
    Entropy is a measure of the uncertainty of a statistical experiment, or the measure of information provided by experimentation. Several measures of entropy are used in uncertainty considerations for nominal, ordinal, and metric data, and specifically in qualitative variation calculations. In addition, relative entropy concepts (e.g. mutual information) are used in goodness-of-fit tests and, more generally, in checking the adequacy of a statistical model. In particular, relative entropy measures are used in correlation or association estimation. In this study, based on a specific definition of mutual information, we use several different relative entropy measures and compare them under three different situations through applications. (An illustrative sketch of mutual information as a relative entropy follows this listing.)
  • Item
    Some applications of Kullback-Leibler and Jeffreys’ divergences in multinomial populations
    (Selçuk Üniversitesi, 2012) Evren, Atif Ahmet
    Some of the entropy measures proposed are Shannon entropy (1948), Rényi entropy (1961), Havrda & Charvát entropy (1967), and Tsallis entropy (1988). The limit of Rényi divergence is relative entropy (or Kullback-Leibler divergence), which is a measure of discrepancy between two statistical hypotheses or two probability distributions. Jeffreys’ divergence is a measure of the difficulty of discriminating between two probability distributions. These divergence measures are asymptotically related to chi-square distributions, so they can be used in hypothesis tests. In this study I try to show, through examples, that entropy-based statistics such as Kullback-Leibler divergence and Jeffreys’ divergence can be used in statistical hypothesis tests for multinomial populations. (A sketch of the corresponding chi-square test follows this listing.)
  • Item
    Some Applications of Entropy-Based Statistics in Linear Regression Analysis
    (Selçuk Üniversitesi, 2012) Evren, Atif Ahmet; Tuna, Elif
    Statistical entropy is a measure of the variation of a distribution, especially when the random variable is qualitative. Entropy-based statistics are also used to measure the degree of association between qualitative variables. Two measures of divergence, namely Kullback-Leibler divergence and Jeffreys’ divergence, are closely related to the log-likelihood function, so these two entropy-based measures can be used in hypothesis-testing procedures as well. In this study, we discuss how relative entropy measures are applied in testing some hypotheses and how useful they would be in regression analysis, especially in determining influential observations. (An illustrative leave-one-out sketch follows this listing.)
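
The first item treats mutual information as a relative entropy: the Kullback-Leibler divergence between a joint distribution and the product of its marginals, used as a measure of association. A minimal Python sketch of that definition follows; the 2x2 joint distribution and the function name are made-up assumptions for illustration, not taken from the paper.

    import numpy as np

    def mutual_information(joint):
        """Mutual information of a two-way probability table: the Kullback-Leibler
        divergence between the joint distribution and the product of its marginals."""
        joint = np.asarray(joint, float)
        px = joint.sum(axis=1, keepdims=True)   # row marginals
        py = joint.sum(axis=0, keepdims=True)   # column marginals
        indep = px * py                         # joint distribution under independence
        mask = joint > 0                        # zero cells contribute nothing
        return float(np.sum(joint[mask] * np.log(joint[mask] / indep[mask])))

    # Made-up 2x2 joint distribution of two binary variables.
    joint = [[0.30, 0.10],
             [0.15, 0.45]]
    print(f"I(X;Y) = {mutual_information(joint):.4f} nats")   # 0 would mean independence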
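
The second item notes that Kullback-Leibler and Jeffreys’ divergences are asymptotically related to chi-square distributions and can therefore drive hypothesis tests for multinomial populations. The sketch below shows the standard form of that idea: for a multinomial sample of size n, 2n times the KL divergence from the empirical distribution to the hypothesised one coincides with the likelihood-ratio G² statistic and is asymptotically chi-square with k - 1 degrees of freedom under the null hypothesis. The die-rolling counts are invented for illustration; the paper's own examples are not reproduced here.

    import numpy as np
    from scipy.stats import chi2

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q); cells with p_i == 0 contribute 0."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def jeffreys_divergence(p, q):
        """Jeffreys’ divergence: the symmetrised sum D(p || q) + D(q || p)."""
        return kl_divergence(p, q) + kl_divergence(q, p)

    # Invented example: test H0 that a die is fair, from 600 rolls.
    counts = np.array([90, 110, 95, 105, 120, 80])   # observed frequencies
    n = counts.sum()
    p_hat = counts / n                               # empirical distribution
    p0 = np.full(6, 1 / 6)                           # hypothesised distribution

    # 2n * D(p_hat || p0) is the likelihood-ratio (G^2) statistic,
    # asymptotically chi-square with k - 1 degrees of freedom under H0.
    g2 = 2 * n * kl_divergence(p_hat, p0)
    print(f"G^2 = {g2:.3f}, p-value = {chi2.sf(g2, df=len(p0) - 1):.4f}")
    print(f"Jeffreys divergence J(p_hat, p0) = {jeffreys_divergence(p_hat, p0):.5f}")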
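
The third item links entropy-based divergences to influence diagnostics in linear regression without spelling out the construction in the abstract. One plausible illustration, not necessarily the paper's own measure, scores each observation by the Kullback-Leibler divergence between the normal predictive densities implied by the full-data least-squares fit and by the fit with that observation deleted; the simulated data and this deletion-based score are assumptions made for the sketch.

    import numpy as np

    def kl_normal(mu1, var1, mu2, var2):
        """KL divergence D(N(mu1, var1) || N(mu2, var2)) between two normal densities."""
        return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

    def ols_fit(X, y):
        """Ordinary least squares: coefficients and the maximum-likelihood residual variance."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return beta, float(resid @ resid / len(y))

    rng = np.random.default_rng(0)
    n = 30
    x = rng.uniform(0, 10, n)
    y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)
    y[5] += 8.0                                  # plant one outlier so it becomes influential
    X = np.column_stack([np.ones(n), x])

    beta_full, var_full = ols_fit(X, y)

    # Influence score for observation i: average KL divergence between the predictive
    # normal densities of the full fit and of the fit with observation i deleted.
    influence = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta_i, var_i = ols_fit(X[keep], y[keep])
        influence[i] = np.mean(kl_normal(X @ beta_full, var_full, X @ beta_i, var_i))

    print("most influential observation:", int(np.argmax(influence)))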
