Relative information : theories and applications
Author
Guy Jumarie
Bibliographic Details
Relative information : theories and applications
(Springer series in synergetics, v. 47)
Springer-Verlag, c1990
- : [pbk.]
Held by 3 university libraries
Notes
"Softcover reprint of the hardcover 1st edition 1990" -- t.p. verso
Includes bibliographical references (p. [251]-254) and index
Description and Table of Contents
Description
For four decades, information theory has been viewed almost exclusively as a theory based upon the Shannon measure of uncertainty and information, usually referred to as Shannon entropy. Since the publication of Shannon's seminal paper in 1948, the theory has grown extremely rapidly and has been applied with varied success in almost all areas of human endeavor. At this time, Shannon information theory is a well-established and developed body of knowledge. Among its most significant recent contributions has been the use of the complementary principles of minimum and maximum entropy in dealing with a variety of fundamental systems problems such as predictive systems modelling, pattern recognition, image reconstruction, and the like. Since its inception in 1948, the Shannon theory has been viewed as a restricted information theory. It has often been argued that the theory is capable of dealing only with the syntactic aspects of information, but not with its semantic and pragmatic aspects. This restriction was considered a virtue by some experts and a vice by others. More recently, however, various arguments have been made that the theory can be appropriately modified to account for semantic aspects of information as well. Some of the most convincing arguments in this regard are included in Fred Dretske's Knowledge and the Flow of Information (The M.I.T. Press, Cambridge, Mass., 1981) and in this book by Guy Jumarie.
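The Shannon entropy named in this description is the expected uncertainty of a discrete probability distribution. As a purely illustrative aid (not drawn from the book), the following minimal Python sketch computes it; the function name and the example distributions are hypothetical.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum_i p_i * log(p_i) of a discrete distribution.

    `probs` is an iterable of probabilities summing to 1; terms with p_i = 0
    contribute nothing, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```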
Table of Contents
1. Relative Information - What For?.- 1.1 Information Theory, What Is It?.- 1.1.1 Summary of the Story.- 1.1.2 Communication and Information.- 1.2 Information and Natural Language.- 1.2.1 Syntax, Semantics, Lexeme.- 1.2.2 Information, Learning, Dislearning.- 1.3 Prerequisites for a Theory of Information.- 1.3.1 Relativity of Information.- 1.3.2 Negative Information.- 1.3.3 Entropy of Form and Pattern.- 1.3.4 Information and Thermodynamics.- 1.3.5 Information and Subjectivity.- 1.4 Information and Systems.- 1.4.1 A Model of General Systems.- 1.4.2 A Model of Relative Information.- 1.4.3 A Few Comments.- 1.5 How We Shall Proceed.- 1.5.1 Aim of the Book.- 1.5.2 Subjective Information and Relative Information.- 1.5.3 Minkowskian Observation of Events.- 1.5.4 A Unified Approach to Discrete Entropy and Continuous Entropy.- 1.5.5 A Word of Caution to the Reader.-
2. Information Theory - The State of the Art.- 2.1 Introduction.- 2.2 Shannon Measure of Uncertainty.- 2.2.1 The Probabilistic Framework.- 2.2.2 Shannon Informational Entropy.- 2.2.3 Entropy of Random Variables.- 2.3 An Intuitive Approach to Entropy.- 2.3.1 Uniform Random Experiments.- 2.3.2 Non-Uniform Random Experiments.- 2.4 Conditional Entropy.- 2.4.1 Framework of Random Experiments.- 2.4.2 Application to Random Variables.- 2.5 A Few Properties of Discrete Entropy.- 2.6 Prior Characterization of Discrete Entropy.- 2.6.1 Properties of Uncertainty.- 2.6.2 Some Consequences of These Properties.- 2.7 The Concept of Information.- 2.7.1 Shannon Information.- 2.7.2 Some Properties of Transinformation.- 2.7.3 Transinformation of Random Variables.- 2.7.4 Remarks on the Notation.- 2.8 Conditional Transinformation.- 2.8.1 Main Definition.- 2.8.2 Some Properties of Conditional Transinformation.- 2.8.3 Conditional Transinformation of Random Variables.- 2.9 Renyi Entropy.- 2.9.1 Definition of Renyi Entropy.- 2.9.2 Meaning of the Renyi Entropy.- 2.9.3 Some Properties of the Renyi Entropy.- 2.10 Cross-Entropy or Relative Entropy.- 2.10.1 The Main Definition.- 2.10.2 A Few Comments.- 2.11 Further Measures of Uncertainty.- 2.11.1 Entropy of Degree c.- 2.11.2 Quadratic Entropy.- 2.11.3 R-norm Entropy.- 2.11.4 Effective Entropy.- 2.12 Entropies of Continuous Variables.- 2.12.1 Continuous Shannon Entropy.- 2.12.2 Some Properties of Continuous Entropy.- 2.12.3 Continuous Transinformation.- 2.12.4 Further Extensions.- 2.13 Hatori's Derivation of Continuous Entropy.- 2.14 Information Without Probability.- 2.14.1 A Functional Approach.- 2.14.2 Relative Information.- 2.15 Information and Possibility.- 2.15.1 A Few Prerequisites.- 2.15.2 A Measure of Uncertainty Without Probability.- 2.16 Conclusions.-
3. A Critical Review of Shannon Information Theory.- 3.1 Introduction.- 3.2 On the Invariance of Measures of Information.- 3.3 On the Modelling of Negative Transinformation.- 3.3.1 Classification of Terms.- 3.3.2 The Problem of Modelling "True" and "False".- 3.3.3 Error-Detecting Codes.- 3.4 On the Symmetry of Transinformation.- 3.4.1 A Diverting Example.- 3.4.2 Application of Information Theory.- 3.4.3 On the Symmetry of Transinformation.- 3.4.4 On a Possible Application of Renyi Entropy.- 3.5 Entropy and the Central Limit Theorem.- 3.5.1 The Central Limit Theorem.- 3.5.2 An Information Theoretical Approach to the Central Limit Theorem.- 3.5.3 Relation with Thermodynamics.- 3.5.4 Continuous Entropy Versus Discrete Entropy.- 3.6 On the Entropy of Continuous Variables.- 3.6.1 The Sign of the Continuous Entropy.- 3.6.2 A Nice Property of Continuous Entropy.- 3.7 Arguments to Support Continuous Entropy.- 3.7.1 On the Negativeness of Continuous Entropy.- 3.7.2 On the Non-invariance of Continuous Entropy.- 3.7.3 Channel Capacity in the Presence of Noise.- 3.8 The Maximum Entropy Principle.- 3.8.1 Statement of the Principle.- 3.8.2 Some Examples.- 3.9 Arguments to Support the Maximum Entropy Principle.- 3.9.1 Information Theoretical Considerations.- 3.9.2 Thermodynamic Considerations.- 3.9.3 Axiomatic Derivation.- 3.9.4 A Few Comments.- 3.10 Information, Syntax, Semantics.- 3.10.1 On the Absolute Nature of Information.- 3.10.2 Information and Natural Language.- 3.11 Information and Thermodynamics.- 3.11.1 Informational and Thermodynamic Entropy.- 3.11.2 Thermodynamic Entropy of Open Systems.- 3.12 Conclusions.-
4. A Theory of Relative Information.- 4.1 Introduction.- 4.2 Observation, Aggregation, Invariance.- 4.2.1 Principle of Aggregation.- 4.2.2 Principle of Invariance.- 4.2.3 A Few Comments.- 4.3 Observation with Informational Invariance.- 4.4 Euclidean Invariance.- 4.4.1 Orthogonal Transformation.- 4.4.2 Application to the Observation of Probabilities.- 4.4.3 Application to the Observation of Classes.- 4.5 Minkowskian Invariance.- 4.5.1 Lorentz Transformation.- 4.5.2 Application to the Observation of Probabilities.- 4.5.3 Application to the Observation of Classes.- 4.6 Euclidean or Minkowskian Observation?.- 4.6.1 Selection of the Observation Mode.- 4.6.2 Application to the [Uncertainty, Information] Pair.- 4.7 Information Processes and Natural Languages.- 4.7.1 On the Absoluteness of Information.- 4.7.2 The Information Process.- 4.7.3 Natural Language.- 4.7.4 Information and Natural Language.- 4.8 Relative Informational Entropy.- 4.8.1 Introduction to the Relative Observation.- 4.8.2 Informational Invariance of the Observation.- 4.8.3 Relative Entropy.- 4.8.4 Comments and Remarks.- 4.9 Conditional Relative Entropy.- 4.9.1 Relative Entropy of Product of Messages.- 4.9.2 Composition Law for Cascaded Observers.- 4.9.3 Relative Entropy Conditional to a Given Experiment.- 4.9.4 Applications to Determinacy.- 4.9.5 Comparison of H(?/?) with Hr(?/?).- 4.10 On the Meaning and the Estimation of the Observation Parameter.- 4.10.1 Estimation of the Observation Parameter.- 4.10.2 Practical Meaning of the Observation Parameter.- 4.10.3 On the Value of the Observation Parameter u(R).- 4.11 Relative Transinformation.- 4.11.1 Derivation of Relative Transinformation.- 4.11.2 Some Properties of the Relative Transinformation.- 4.11.3 Relative Entropy and Information Balance.- 4.11.4 Application to the Encoding Problem.- 4.12 Minkowskian Relative Transinformation.- 4.12.1 Definition of Minkowskian Relative Transinformation.- 4.12.2 Some Properties of Minkowskian Relative Transinformation.- 4.12.3 Identification via Information Balance.- 4.13 Effect of Scaling Factor in an Observation with Informational Invariance.- 4.14 Comparison with Renyi Entropy.- 4.14.1 Renyi Entropy and Relative Entropy.- 4.14.2 Transinformation of Order c and Relative Transinformation.-
5. A Theory of Subjective Information.- 5.1 Introduction.- 5.2 Subjective Entropy.- 5.2.1 Definition of Subjective Entropy.- 5.2.2 A Few Remarks.- 5.3 Conditional Subjective Entropy.- 5.3.1 Definition of Conditional Subjective Entropy.- 5.3.2 Application to Determinacy.- 5.3.3 A Basic Inequality.- 5.4 Subjective Transinformation.- 5.4.1 Introduction.- 5.4.2 Subjective Transinformation.- 5.4.3 A Few Properties of Subjective Transinformation.- 5.4.4 Application to Independent Random Experiments.- 5.5 Conditional Subjective Transinformation.- 5.5.1 Definition.- 5.5.2 A Few Properties of Subjective Conditional Transinformation.- 5.6 Information Balance.- 5.6.1 Optimum Conditions for Information Balance.- 5.6.2 Non-optimum Conditions for Information Balance.- 5.7 Explicit Expression of Subjective Transinformation.- 5.7.1 Discrete Probability.- 5.7.2 Continuous Probability.- 5.8 The General Coding Problem.- 5.8.1 Preliminary Remarks.- 5.8.2 The General Coding Problem.- 5.8.3 On the Problem of Error-Correcting Codes Revisited.- 5.9 Capacity of a Channel.- 5.9.1 The General Model.- 5.9.2 Channel with Noise.- 5.9.3 Channel with Noise and Filtering.- 5.10 Transinformation in the Presence of Fuzziness.- 5.10.1 On the Entropy of a Fuzzy Set.- 5.10.2 Application of Subjective Transinformation.- 5.10.3 The Brillouin Problem.- 5.11 On the Use of Renyi Entropy.- 5.11.1 Renyi Entropy and Subjective Entropy.- 5.11.2 Transinformation of Order c and Shannon Transinformation.-
6. A Unified Approach to Discrete and Continuous Entropy.- 6.1 Introduction.- 6.2 Intuitive Derivation of "Total Entropy".- 6.2.1 Preliminary Definitions and Notation.- 6.2.2 Physical Derivation of He(X).- 6.3 Mathematical Derivation of Total Entropy.- 6.3.1 The Main Axioms.- 6.3.2 Derivation of Total Entropy.- 6.3.3 On the Expression of the Total Entropy.- 6.4 Alternative Set of Axioms for the Total Entropy.- 6.4.1 Generalization of Shannon Recurrence Equation.- 6.4.2 A Model via Uniform Interval of Definition.- 6.5 Total Entropy with Respect to a Measure.- 6.6 Total Renyi Entropy.- 6.6.1 Preliminary Remarks About Renyi Entropy.- 6.6.2 Axioms for Total Renyi Entropy.- 6.6.3 Total Renyi Entropy.- 6.6.4 Total Renyi Entropy with Respect to a Measure.- 6.7 On the Practical Meaning of Total Entropy.- 6.7.1 General Remarks.- 6.7.2 Total Entropy and Relative Entropy.- 6.7.3 Total Entropy and Subjective Entropy.- 6.8 Further Results on the Total Entropy.- 6.8.1 Some Mathematical Properties of the Total Entropy.- 6.8.2 On the Entropy of Pattern.- 6.9 Transinformation and Total Entropy.- 6.9.1 Total Entropy of a Random Vector.- 6.9.2 Conditional Total Entropy.- 6.9.3 On the Definition of Transinformation.- 6.9.4 Total Kullback Entropy.- 6.10 Relation Between Total Entropy and Continuous Entropy.- 6.10.1 Total Shannon Entropy and Continuous Entropy.- 6.10.2 Total Renyi Entropy and Continuous Renyi Entropy.- 6.10.3 Application to an Extension Principle.- 6.11 Total Entropy and Mixed Theory of Information.- 6.11.1 Background to Effective Entropy and Inset Entropy.- 6.11.2 Inset Entropy is an Effective Entropy.-
7. A Unified Approach to Informational Entropies via Minkowskian Observation.- 7.1 Introduction.- 7.2 Axioms of the Minkowskian Observation.- 7.2.1 Definition of the Observation Process.- 7.2.2 Statement of the Axioms.- 7.2.3 A Few Comments.- 7.3 Properties of the Minkowskian Observation.- 7.3.1 Learning Process.- 7.3.2 Practical Determination of u.- 7.3.3 Cascaded Observation of Variables.- 7.4 Minkowskian Observations in ℝ³.- 7.4.1 Derivation of the Equations.- 7.4.2 A Few Comments.- 7.5 The Statistical Expectation Approach to Entropy.- 7.5.1 Preliminary Remarks.- 7.5.2 Entropy and Statistical Expectation.- 7.6 Relative Entropy and Subjective Entropy.- 7.6.1 Comparison of Hr with Hs.- 7.6.2 Comparison of T(?/?) with J(?/?).- 7.7 Weighted Relative Entropy.- 7.7.1 Background on Weighted Entropy.- 7.7.2 Weighted Relative Entropy.- 7.7.3 Weighted Relative Entropy of a Vector.- 7.7.4 Total Weighted Relative Entropy.- 7.7.5 Observation and Subjectivity.- 7.8 Weighted Transinformation.- 7.8.1 Some Preliminary Remarks.- 7.8.2 Weighted Relative Conditional Entropy.- 7.8.3 Weighted Transinformation.- 7.8.4 Relation to Renyi Entropy.- 7.9 Weighted Cross-Entropy, Weighted Relative Entropy.- 7.9.1 Background on Kullback Entropy.- 7.9.2 Weighted Kullback Entropy.- 7.10 Weighted Relative Divergence.- 7.11 Application to Continuous Distributions.- 7.11.1 The General Principle of the Derivations.- 7.11.2 Continuous Weighted Relative Entropy.- 7.11.3 Continuous Weighted Kullback Entropy.- 7.11.4 Continuous Weighted Relative Divergence.- 7.12 Transformation of Variables in Weighted Relative Entropy.- 7.13 Conclusions.-
8. Entropy of Form and Pattern.- 8.1 Introduction.- 8.1.1 Entropy of Form and Fractal Dimension.- 8.1.2 Entropy and Pattern Representation.- 8.1.3 Randomized Definition or Geometrical Approach?.- 8.2 Total Entropy of a Random Vector.- 8.2.1 Definition of the Total Entropy of an m-Vector.- 8.2.2 Some Properties of the Total Entropy of m-Vectors.- 8.3 Total Entropy of a Discrete Quantized Stochastic Process.- 8.3.1 Preliminary Remarks.- 8.3.2 Axioms for the Total Uncertainty of a Discrete Trajectory.- 8.3.3 Expression for the Total Entropy of the Trajectory.- 8.3.4 Some Properties of the Total Entropy of the Trajectory.- 8.3.5 A Generalization.- 8.4 Total Renyi Entropy of Discrete Quantized Stochastic Processes.- 8.4.1 Total Renyi Entropy of a Vector.- 8.4.2 Derivation of the Total Renyi Entropy.- 8.5 Entropy of Order c Revisited.- 8.5.1 Preliminary Remarks.- 8.5.2 A New Definition of Total Entropy of Order c.- 8.6 Entropy of Continuous White Stochastic Trajectories.- 8.6.1 Notation and Remarks.- 8.6.2 A List of Desiderata for the Entropy of a White Trajectory.- 8.6.3 Towards a Mathematical Expression of the Trajectory Entropy.- 8.6.4 Entropy of Continuous White Trajectories.- 8.7 Entropy of Form and Observation Modes.- 8.7.1 Relative Uncertainty via Observation Modes.- 8.7.2 Classification of Observation Modes.- 8.7.3 Trajectory Entropy via White Observation.- 8.8 Trajectory Entropies of Stochastic Processes.- 8.8.1 Trajectory Shannon Entropy.- 8.8.2 Renyi Entropy of a Stochastic Trajectory.- 8.8.3 A Few Comments.- 8.9 Trajectory Entropy of a Stochastic Process Under Local Markovian Observation.- 8.9.1 Markovian Processes.- 8.9.2 Non-Markovian Processes.- 8.9.3 Transformation of Variables.- 8.9.4 A Few Remarks.- 8.10 Trajectory Entropy of a Stochastic Process Under Global Markovian Observation.- 8.10.1 Markovian Processes.- 8.10.2 Non-Markovian Processes.- 8.10.3 Transformation of Variables.- 8.10.4 A Few Remarks.- 8.11 Trajectory Entropies of Stochastic Vectors.- 8.11.1 Notation and Preliminary Results.- 8.11.2 Trajectory Entropies in ℝⁿ.- 8.12 Transinformation of Stochastic Trajectories.- 8.12.1 Definition of Transinformation of Trajectories.- 8.12.2 Transinformation Measures of Stochastic Trajectories.- 8.12.3 Application to the Derivation of Conditional Trajectory Entropies.- 8.13 On the Entropy of Deterministic Pattern.- 8.13.1 Background on Some Results.- 8.13.2 On the Entropy of Deterministic Pattern.- 8.13.3 Dependence of the Entropy upon the Observation Mode.- 8.14 Entropy of a Differentiable Mapping.- 8.14.1 Entropy with Respect to a Family of Distributions.- 8.14.2 Maximum Entropy of a Differentiable Mapping.- 8.14.3 Entropy of a Mapping Defined on a Finite Interval.- 8.14.4 Entropy of a Mapping with an Incomplete Probability Distribution.- 8.15 Entropy of Degree d of Differential Mappings.- 8.15.1 Trajectory Entropy of Degree d.- 8.15.2 Some Properties of Trajectory Entropy of Degree d.- 8.15.3 Practical Meaning of Trajectory Entropy of Degree d.- 8.16 Transinformation Between Differentiable Mappings.- 8.16.1 The Trajectory Entropy of Compositions of Functions.- 8.16.2 Application to Transinformation Between Mappings.- 8.17 Trajectory Entropy of Degree d in Intrinsic Coordinates.- 8.18 Trajectory Entropy of ℝⁿ → ℝⁿ Mapping.- 8.19 Trajectory Entropy of Degree d of a Discrete Mapping.- 8.20 Trajectory Entropy and Liapunov Exponent.-
9. A Theory of Relative Statistical Decision.- 9.1 Introduction.- 9.2 Background on Observation with Information Invariance.- 9.2.1 Euclidean or Minkowskian Invariance.- 9.2.2 On the Practical Meaning of the Model.- 9.2.3 Syntax and Semantics.- 9.3 Relative Probability.- 9.3.1 Definition of Relative Probability.- 9.3.2 On the Practical Meaning of Relative Probability.- 9.3.3 Semantic Entropy of a Real-Valued Variable.- 9.3.4 Relative Probability of Deterministic Events.- 9.3.5 Relative Probability of the Impossible Event.- 9.4 Composition Laws for Relative Probability.- 9.4.1 Relative Probability of the Complement of an Event.- 9.4.2 Relative Probability of a Deterministic Event.- 9.4.3 Relative Probability of Intersection of Events.- 9.4.4 Relative Probability of the Union of Events.- 9.5 Generalized Maximum Likelihood Criterion.- 9.5.1 The Problem.- 9.5.2 Generalized Maximum Likelihood Criterion.- 9.5.3 Practical Meaning of the Generalized Maximum Likelihood Criterion.- 9.5.4 A Simplified Criterion.- 9.5.5 Another Simplified Decision Criterion.- 9.5.6 Comparison with Another Criterion.- 9.6 Generalized Bayesian Criterion.- 9.6.1 Pattern Recognition.- 9.6.2 Formulation of the Statistical Decision Problem.- 9.6.3 Sharp Decision Rule.- 9.6.4 Soft Decision Rule.- 9.7 Generalized Bayesian Decision and Path Integrals.- 9.7.1 Formulation of the Problem.- 9.7.2 The Main Assumptions and Solution.- 9.8 Error Probability and Generalized Maximum Likelihood.- 9.8.1 Definition of the Problem.- 9.8.2 Application of the Generalized Maximum Likelihood Criterion.- 9.8.3 Application of the Generalized Bayesian Rule.- 9.9 A Pattern Recognition Problem.- 9.9.1 An Illustrative Example.- 9.9.2 Application of the Sharp Decision Rule.- 9.9.3 Application of the Soft Decision Rule.- 9.10 Concluding Remarks.-
10. Concluding Remarks and Outlook.- References.
From "Nielsen BookData"