Entropy and Information Theory
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New...
| Classification: | Electronic Book |
| --- | --- |
| Main Author: | |
| Corporate Author: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | New York, NY : Springer US : Imprint: Springer, 2011. |
| Edition: | 2nd ed. 2011. |
| Subjects: | |
| Online Access: | Full Text |
Table of Contents:
- Preface
- Introduction
- Information Sources
- Pair Processes: Channels, Codes, and Couplings
- Entropy
- The Entropy Ergodic Theorem
- Distortion and Approximation
- Distortion and Entropy
- Relative Entropy
- Information Rates
- Distortion vs. Rate
- Relative Entropy Rates
- Ergodic Theorems for Densities
- Source Coding Theorems
- Coding for Noisy Channels
- Bibliography
- References
- Index.