Entropy and information theory

Bibliographic Details

Entropy and information theory

Robert M. Gray

Springer-Verlag, c1990


Held by 66 university libraries


Notes

Bibliography: p. 315-326

Includes index

Description and Table of Contents

Volume

us: ISBN 9780387973715

Description

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
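For orientation, the measures named above have standard discrete-alphabet forms; the following is a sketch of the conventional textbook definitions, not quotations from the book, which develops these quantities in much greater generality:

H(X) = -\sum_{x} p(x) \log p(x)                                   % entropy
H(X \mid Y) = H(X, Y) - H(Y)                                      % conditional entropy
I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}    % mutual information
D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}              % relative entropy (discrimination)
\bar{H} = \lim_{n \to \infty} \tfrac{1}{n} H(X_1, \ldots, X_n)    % entropy rate (limit exists for stationary sources)

The information rate is the analogous normalized limit of mutual information, \lim_{n \to \infty} \tfrac{1}{n} I(X^n; Y^n), taken between the input and output blocks of a channel.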
Volume

gw: ISBN 9783540973713

Description

This text is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes and dynamical systems. Examples are entropy, mutual information and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behaviour of sample information and expected information.

Table of Contents

Contents:

  • Information Sources
  • Entropy and Information
  • The Entropy Ergodic Theorem
  • Information Rates I
  • Relative Entropy
  • Information Rates II
  • Relative Entropy Rates
  • Ergodic Theorems for Densities
  • Channels and Codes
  • Distortion
  • Source Coding Theorems
  • Coding for Noisy Channels
  • Bibliography
  • Index

From "Nielsen BookData"
