
Information Theory Meets Power Laws: Stochastic Processes and Language Models

Bibliographic Details
Classification: Electronic Book
Main Author: Dębowski, Łukasz Jerzy, 1975-, author
Format: Electronic eBook
Language: English
Published: Newark : John Wiley & Sons, Incorporated, 2020.
Subjects:
Online Access: Full text
Table of Contents:
  • Cover
  • Title Page
  • Copyright
  • Contents
  • Preface
  • Acknowledgments
  • Basic Notations
  • Chapter 1 Guiding Ideas
  • 1.1 The Motivating Question
  • 1.2 Further Questions About Texts
  • 1.3 Zipf's and Herdan's Laws
  • 1.4 Markov and Finite-State Processes
  • 1.5 More General Stochastic Processes
  • 1.6 Two Interpretations of Probability
  • 1.7 Insights from Information Theory
  • 1.8 Estimation of Entropy Rate
  • 1.9 Entropy of Natural Language
  • 1.10 Algorithmic Information Theory
  • 1.11 Descriptions of a Random World
  • 1.12 Facts and Words Related
  • 1.13 Repetitions and Entropies
  • 4.3 Ergodic and Mixing Criteria
  • 4.4 Ergodic Decomposition
  • Problems
  • Chapter 5 Entropy and Information
  • 5.1 Shannon Measures for Partitions
  • 5.2 Block Entropy and Its Limits
  • 5.3 Shannon Measures for Fields
  • 5.4 Block Entropy Limits Revisited
  • 5.5 Convergence of Entropy
  • 5.6 Entropy as Self-Information
  • Problems
  • Chapter 6 Equipartition and Universality
  • 6.1 SMB Theorem
  • 6.2 Universal Semidistributions
  • 6.3 PPM Probability
  • 6.4 SMB Theorem Revisited
  • 6.5 PPM-based Statistics
  • Problems
  • Chapter 7 Coding and Computation
  • 7.1 Elements of Coding
  • 7.2 Kolmogorov Complexity
  • 7.3 Algorithmic Coding Theorems
  • 7.4 Limits of Mathematics
  • 7.5 Algorithmic Randomness
  • Problems
  • Chapter 8 Power Laws for Information
  • 8.1 Hilberg Exponents
  • 8.2 Second Order SMB Theorem
  • 8.3 Probabilistic and Algorithmic Facts
  • 8.4 Theorems About Facts and Words
  • Problems
  • Chapter 9 Power Laws for Repetitions
  • 9.1 Rényi-Arimoto Entropies
  • 9.2 Generalized Entropy Rates
  • 9.3 Recurrence Times
  • 9.4 Subword Complexity
  • 9.5 Two Maximal Lengths
  • 9.6 Logarithmic Power Laws
  • Problems
  • Chapter 10 AMS Processes
  • 10.1 AMS and Pseudo AMS Measures
  • 10.2 Quasiperiodic Coding
  • 10.3 Synchronizable Coding
  • 10.4 Entropy Rate in the AMS Case
  • Problems
  • Chapter 11 Toy Examples
  • 11.1 Finite and Ultrafinite Energy
  • 11.2 Santa Fe Processes and Alike
  • 11.3 Encoding into a Finite Alphabet
  • 11.4 Random Hierarchical Association
  • 11.5 Toward Better Models
  • Problems
  • Future Research
  • Bibliography
  • Index
  • EULA