Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives
This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms for adapting linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework in which the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) …
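The nonparametric estimators the description alludes to are built from Parzen (kernel) density estimates rather than sample moments. As a minimal sketch of that idea, and not an implementation from the book, the snippet below computes a Parzen-window estimate of Renyi's quadratic entropy for a one-dimensional sample with a Gaussian kernel; the function name, the kernel width `sigma`, and the toy data are illustrative assumptions.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    Information potential: V(X) = (1/N^2) * sum_ij G_{sigma*sqrt(2)}(x_i - x_j),
    with G a Gaussian kernel; then H2(X) = -log V(X).
    `sigma` is a user-chosen kernel width (an assumption, not prescribed here).
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    n = x.shape[0]
    diff = x - x.T                          # pairwise differences x_i - x_j
    s2 = 2.0 * sigma ** 2                   # variance of the convolved kernel
    gauss = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = gauss.sum() / n ** 2
    return -np.log(information_potential)

# Toy usage: entropy estimate for a small Gaussian sample.
rng = np.random.default_rng(0)
sample = rng.normal(size=200)
print(renyi_quadratic_entropy(sample, sigma=0.5))
```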
Classification: | Electronic Book |
---|---|
Main Author: | |
Corporate Author: | |
Format: | Electronic eBook |
Language: | English |
Published: | New York, NY : Springer New York : Imprint: Springer, 2010. |
Edition: | 1st ed. 2010. |
Series: | Information Science and Statistics |
Subjects: | |
Online Access: | Full Text |
Table of Contents:
- Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces
- Renyi's Entropy, Divergence and Their Nonparametric Estimators
- Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria
- Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems
- Nonlinear Adaptive Filtering with MEE, MCC, and Applications
- Classification with EEC, Divergence Measures, and Error Bounds
- Clustering with ITL Principles
- Self-Organizing ITL Principles for Unsupervised Learning
- A Reproducing Kernel Hilbert Space Framework for ITL
- Correntropy for Random Variables: Properties and Applications in Statistical Inference
- Correntropy for Random Processes: Properties and Applications in Signal Processing.
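Several chapters revolve around correntropy and the maximum correntropy criterion (MCC). As a hedged illustration of the quantity those titles refer to, the sketch below estimates correntropy between two sequences with a Gaussian kernel; the function name, the kernel width `sigma`, and the toy signals are illustrative assumptions, not drawn from the book.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    # Sample correntropy: V_sigma(X, Y) ~ (1/N) * sum_i G_sigma(x_i - y_i),
    # where G_sigma is a Gaussian kernel; sigma is a user-chosen width (assumption).
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    g = np.exp(-e ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))
    return g.mean()

# Toy comparison: correntropy is dominated by samples where x_i is close to y_i,
# so a few gross outliers barely move it, unlike a mean-squared-error score.
rng = np.random.default_rng(0)
clean = rng.normal(size=500)
noisy = clean + rng.normal(scale=0.1, size=500)
noisy[:5] += 50.0                      # inject gross outliers
print(correntropy(clean, noisy, sigma=0.5))
```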