
On measures of information and their characterizations.

Bibliographic Details
Classification: Electronic book
Main author: Aczél, J.
Other authors: Daróczy, Zoltán
Format: Electronic eBook
Language: English
Published: New York : Academic Press, 1975.
Series: Mathematics in science and engineering ; v. 115.
Subjects:
Online access: Full text
Table of Contents:
  • Front Cover; On Measures of Information and Their Characterizations; Copyright Page; Contents; Preface; Chapter 0. Introduction. Entropy of a Single Event. Functional Equations; 0.1 Introduction, History; 0.2 Entropy of One Event; 0.3 The Cauchy Functional Equation and Related Equations; 0.4 Completely Additive Number Theoretical Functions; Chapter 1. Shannon's Measure of Information; 1.1 Shannon's Entropy; 1.2 Algebraic Properties of the Shannon Entropy; 1.3 Analytic Properties of the Shannon Entropy. Inequalities
  • 1.4 Some Simple Examples of Applications of Entropies to Logical Games and Coding; 1.5 Coding; 1.6 Optimal Coding and Entropy; Chapter 2. Some Desirable Properties of Entropies and Their Correlations. The Hincin and Faddeev Characterizations of Shannon's Entropy; 2.1 A List of Desiderata for Entropies; 2.2 Generalized Representations; 2.3 Correlations and Simple Consequences of the Desiderata; 2.4 The Shannon-Hincin and the Faddeev Characterizations of the Shannon Entropy; Chapter 3. The Fundamental Equation of Information; 3.1 Information Functions
  • 3.2 Information Functions Continuous at the Origin - Entropies Which Are Small for Small Probabilities; 3.3 Nonnegative Bounded Information Functions and Entropies; 3.4 Measurable Information Functions and Entropies; 3.5 The General Solution of the Fundamental Equation of Information; Chapter 4. Further Characterizations of the Shannon Entropy; 4.1 The Branching Property; 4.2 Entropies with the Sum Property; 4.3 Characterization of the Shannon Entropy by the Shannon Inequality; 4.4 Subadditive Additive Entropies; Chapter 5. Rényi Entropies; 5.1 Entropies and Mean Values
  • 5.2 Equality of Average Probabilities. Additivity of Average Entropies; 5.3 Rényi Entropies for Possibly Incomplete Distributions; 5.4 Optimal Coding and Rényi Entropies; 5.5 Characterization of β-Averages; Chapter 6. Generalized Information Functions; 6.1 Relative Information; 6.2 Information Functions of Degree α; 6.3 Generalized Entropies; Chapter 7. Further Measures of Information; 7.1 Further Entropies. Other Measures of Information Derived Algebraically from Entropies; 7.2 Further Measures of Information Depending on Two Probability Distributions
  • 7.3 Some Measures of Information Depending on Three Distributions; References; Author Index; Subject Index