LEADER 00000cam a2200000Mu 4500
001    EBOOKCENTRAL_on1204139884
003    OCoLC
005    20240329122006.0
006    m o d
007    cr |||||||||||
008    201107s2020 xx o ||| 0 eng d
040    |a EBLCP |b eng |c EBLCP |d EBLCP |d UX0 |d REDDC |d OCLCF |d OCLCQ |d OCLCO |d OCLCL
020    |a 9781119625360
020    |a 111962536X
035    |a (OCoLC)1204139884
050  4 |a P98 |b .D436 2021
082 04 |a 410.15195
049    |a UAMI
100 1  |a Dębowski, Łukasz Jerzy, |d 1975- |e author.
245 10 |a Information Theory Meets Power Laws |h [electronic resource] : |b Stochastic Processes and Language Models. |c Łukasz Dębowski, Polish Academy of Sciences.
260    |a Newark : |b John Wiley & Sons, Incorporated, |c 2020.
300    |a 1 online resource (387 p.)
500    |a Description based upon print version of record.
505 0  |a Cover -- Title Page -- Copyright -- Contents -- Preface -- Acknowledgments -- Basic Notations -- Chapter 1 Guiding Ideas -- 1.1 The Motivating Question -- 1.2 Further Questions About Texts -- 1.3 Zipf's and Herdan's Laws -- 1.4 Markov and Finite-State Processes -- 1.5 More General Stochastic Processes -- 1.6 Two Interpretations of Probability -- 1.7 Insights from Information Theory -- 1.8 Estimation of Entropy Rate -- 1.9 Entropy of Natural Language -- 1.10 Algorithmic Information Theory -- 1.11 Descriptions of a Random World -- 1.12 Facts and Words Related -- 1.13 Repetitions and Entropies
505 8  |a 4.3 Ergodic and Mixing Criteria -- 4.4 Ergodic Decomposition -- Problems -- Chapter 5 Entropy and Information -- 5.1 Shannon Measures for Partitions -- 5.2 Block Entropy and Its Limits -- 5.3 Shannon Measures for Fields -- 5.4 Block Entropy Limits Revisited -- 5.5 Convergence of Entropy -- 5.6 Entropy as Self-Information -- Problems -- Chapter 6 Equipartition and Universality -- 6.1 SMB Theorem -- 6.2 Universal Semidistributions -- 6.3 PPM Probability -- 6.4 SMB Theorem Revisited -- 6.5 PPM-based Statistics -- Problems -- Chapter 7 Coding and Computation -- 7.1 Elements of Coding
505 8  |a 7.2 Kolmogorov Complexity -- 7.3 Algorithmic Coding Theorems -- 7.4 Limits of Mathematics -- 7.5 Algorithmic Randomness -- Problems -- Chapter 8 Power Laws for Information -- 8.1 Hilberg Exponents -- 8.2 Second Order SMB Theorem -- 8.3 Probabilistic and Algorithmic Facts -- 8.4 Theorems About Facts and Words -- Problems -- Chapter 9 Power Laws for Repetitions -- 9.1 Rényi-Arimoto Entropies -- 9.2 Generalized Entropy Rates -- 9.3 Recurrence Times -- 9.4 Subword Complexity -- 9.5 Two Maximal Lengths -- 9.6 Logarithmic Power Laws -- Problems -- Chapter 10 AMS Processes
505 8  |a 10.1 AMS and Pseudo AMS Measures -- 10.2 Quasiperiodic Coding -- 10.3 Synchronizable Coding -- 10.4 Entropy Rate in the AMS Case -- Problems -- Chapter 11 Toy Examples -- 11.1 Finite and Ultrafinite Energy -- 11.2 Santa Fe Processes and Alike -- 11.3 Encoding into a Finite Alphabet -- 11.4 Random Hierarchical Association -- 11.5 Toward Better Models -- Problems -- Future Research -- Bibliography -- Index -- EULA
590    |a ProQuest Ebook Central |b Ebook Central Academic Complete
650  0 |a Computational linguistics.
650  6 |a Linguistique informatique.
650  7 |a computational linguistics. |2 aat
650  7 |a Computational linguistics |2 fast
758    |i has work: |a Information theory meets power laws (Text) |1 https://id.oclc.org/worldcat/entity/E39PCGJXFPtwwvXHHk9hyPc6w3 |4 https://id.oclc.org/worldcat/ontology/hasWork
776 08 |i Print version: |a Debowski, Lukasz |t Information Theory Meets Power Laws : Stochastic Processes and Language Models |d Newark : John Wiley & Sons, Incorporated, c2020 |z 9781119625278
856 40 |u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=6384369 |z Texto completo
938    |a ProQuest Ebook Central |b EBLB |n EBL6384369
994    |a 92 |b IZTAP