Universal estimation of information measures for analog sources

Authors

    • Wang, Qing
    • Kulkarni, Sanjeev
    • Verdú, Sergio

Bibliographic Details

Universal estimation of information measures for analog sources

Qing Wang, Sanjeev Kulkarni, and Sergio Verdú

(Foundations and trends [TM] in communications and information theory, 5:3)

now Publishers, c2009

University library holdings: 1

Notes

"The preferred citation for this publication is ... Foundation and Trends [R] in Communications and Information Theory, vol 5, nos 3, pp 265-353, 2008"-T. p. verso

Bibliography: p. 77-93

Description and Table of Contents

Description

Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications, among others, in probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures which does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in information theory. It will be of interest to students, practitioners and researchers working in information theory.

Table of Contents

1: Introduction
2: Plug-in Algorithms
3: Algorithms Based on Partitioning
4: Algorithms Based on k-Nearest-Neighbor Distances
5: Other Algorithms
6: Algorithm Summary and Experiments
7: Sources with Memory
References
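To illustrate the k-nearest-neighbor approach surveyed in Chapter 4, here is a minimal sketch of the classic Kozachenko-Leonenko estimator of differential entropy for a memoryless analog source. The function name `knn_entropy` and the brute-force distance computation are illustrative choices, not taken from the book; the book itself covers these estimators and their consistency conditions in far greater depth.

```python
import numpy as np
from math import gamma, pi, log

EULER = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(m):
    # Digamma function at a positive integer: psi(m) = -gamma + H_{m-1}
    return -EULER + sum(1.0 / i for i in range(1, m))

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats."""
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]            # treat a 1-D array as n samples in R^1
    n, d = x.shape
    # Brute-force pairwise Euclidean distances, O(n^2) memory and time.
    diffs = x[:, None, :] - x[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)          # exclude each point itself
    eps = np.sort(dists, axis=1)[:, k - 1]   # distance to k-th nearest neighbor
    c_d = pi ** (d / 2) / gamma(d / 2 + 1)   # volume of the unit d-ball
    return (digamma_int(n) - digamma_int(k) + log(c_d)
            + d * float(np.mean(np.log(eps))))
```

As a quick sanity check, for a standard Gaussian sample the estimate should approach the true differential entropy 0.5·log(2πe) ≈ 1.419 nats as the sample size grows.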

From "Nielsen BookData"

Details

  • NII Bibliographic ID (NCID)
    BC0385256X
  • ISBN
    • 9781601982308
  • Country of publication code
    us
  • Title language code
    eng
  • Text language code
    eng
  • Place of publication
    Hanover, MA
  • Number of pages/volumes
    ix, 93 p.
  • Size
    24 cm
  • Parent bibliographic ID