Entropy and information theory
Author
Robert M. Gray
Bibliographic Information
Entropy and information theory
Springer, c2011
2nd ed
University library holdings: 24 libraries
Notes
Includes bibliographical references (p. 395-403) and index
Description and Table of Contents
Description
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, together with information and distortion measures and their properties.
New in this edition:
Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
Expanded discussion of results from ergodic theory relevant to information theory
Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
New material on trading off information and distortion, including the Marton inequality
New material on the properties of optimal and asymptotically optimal source codes
New material on the relationships of source coding and rate-constrained simulation or modeling of random processes
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
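For orientation, the sliding-block codes highlighted above admit a short sketch in standard textbook notation; the formulation below is a common convention and is not drawn from this record itself:

% A sliding-block (stationary) code with window half-width m:
% the same fixed map f is applied at every time index, so the
% encoded process {Y_n} inherits stationarity (and ergodicity)
% from the source {X_n}, unlike a block code.
\[
  Y_n = f\bigl(X_{n-m}, X_{n-m+1}, \dots, X_{n+m}\bigr), \qquad n \in \mathbb{Z}.
\]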
Table of Contents
Preface
Introduction
Information Sources
Pair Processes: Channels, Codes, and Couplings
Entropy
The Entropy Ergodic Theorem
Distortion and Approximation
Distortion and Entropy
Relative Entropy
Information Rates
Distortion vs. Rate
Relative Entropy Rates
Ergodic Theorems for Densities
Source Coding Theorems
Coding for Noisy Channels
Bibliography
References
Index
From "Nielsen BookData"