Entropy and information theory
Author(s)
Gray, Robert M.
Bibliographic Information
Entropy and information theory
Springer-Verlag, c1990
- : gw (German printing, ISBN 9783540973713)
- : us (US printing, ISBN 9780387973715)
Available at 66 libraries
Note
Bibliography: p. 315-326
Includes index
Description and Table of Contents
- Volume
- : us ISBN 9780387973715
Description
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
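For reference, the quantities named in this description can be summarized by their standard textbook definitions. The following is a minimal sketch in conventional notation (discrete random variables X, Y with probability mass functions p, and distributions P, Q on a common alphabet); the notation and logarithm base are assumptions, not quoted from the book itself.

% Standard definitions in conventional notation; not taken verbatim from the book.
\begin{align*}
  H(X)        &= -\sum_{x} p(x)\,\log p(x)                       && \text{entropy} \\
  H(X \mid Y) &= -\sum_{x,y} p(x,y)\,\log p(x \mid y)            && \text{conditional entropy} \\
  I(X;Y)      &= H(X) - H(X \mid Y)                              && \text{mutual information} \\
  D(P \,\|\, Q) &= \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}         && \text{relative entropy (discrimination)} \\
  \bar{H}     &= \lim_{n \to \infty} \frac{1}{n}\, H(X_1, \dots, X_n) && \text{entropy rate (when the limit exists)}
\end{align*}

The entropy rate and the analogous information rate are the "limiting normalized versions" referred to above, and their sample-path counterparts are the subject of the ergodic theorems the book develops.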
- Volume
- : gw ISBN 9783540973713
Description
This text is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes and dynamical systems. Examples are entropy, mutual information and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long term asymptotic behaviour of sample information and expected information.
Table of Contents
Contents: Information Sources.- Entropy and Information.- The Entropy Ergodic Theorem.- Information Rates I.- Relative Entropy.- Information Rates II.- Relative Entropy Rates.- Ergodic Theorems for Densities.- Channels and Codes.- Distortion.- Source Coding Theorems.- Coding for Noisy Channels.- Bibliography.- Index.
by "Nielsen BookData"