
Deep Learning through Sparse and Low-Rank Modeling

Deep Learning through Sparse Representation and Low-Rank Modeling bridges classical sparse and low-rank models, which emphasize problem-specific interpretability, with recent deep network models that have enabled larger learning capacity and better utilization of Big Data. It shows how the tool...


Bibliographic Details
Classification: Electronic Book
Other Authors: Wang, Zhangyang (Editor), Fu, Yun (Editor), Huang, Thomas S., 1936- (Editor)
Format: Electronic eBook
Language: English
Published: [Place of publication not identified]: Academic Press, an imprint of Elsevier, [2019]
Series: Computer vision and pattern recognition series.
Online Access: Full text
Table of Contents:
  • Front Cover; Deep Learning Through Sparse and Low-Rank Modeling; Copyright; Contents; Contributors; About the Editors; Preface; Acknowledgments; 1 Introduction; 1.1 Basics of Deep Learning; 1.2 Basics of Sparsity and Low-Rankness; 1.3 Connecting Deep Learning to Sparsity and Low-Rankness; 1.4 Organization; References; 2 Bi-Level Sparse Coding: A Hyperspectral Image Classification Example; 2.1 Introduction; 2.2 Formulation and Algorithm; 2.2.1 Notations; 2.2.2 Joint Feature Extraction and Classification; 2.2.2.1 Sparse Coding for Feature Extraction
  • 2.2.2.2 Task-Driven Functions for Classification; 2.2.2.3 Spatial Laplacian Regularization; 2.2.3 Bi-level Optimization Formulation; 2.2.4 Algorithm; 2.2.4.1 Stochastic Gradient Descent; 2.2.4.2 Sparse Reconstruction; 2.3 Experiments; 2.3.1 Classification Performance on AVIRIS Indiana Pines Data; 2.3.2 Classification Performance on AVIRIS Salinas Data; 2.3.3 Classification Performance on University of Pavia Data; 2.4 Conclusion; 2.5 Appendix; References; 3 Deep l0 Encoders: A Model Unfolding Example; 3.1 Introduction; 3.2 Related Work; 3.2.1 l0- and l1-Based Sparse Approximations
  • 3.2.2 Network Implementation of l1-Approximation; 3.3 Deep l0 Encoders; 3.3.1 Deep l0-Regularized Encoder; 3.3.2 Deep M-Sparse l0 Encoder; 3.3.3 Theoretical Properties; 3.4 Task-Driven Optimization; 3.5 Experiment; 3.5.1 Implementation; 3.5.2 Simulation on l0 Sparse Approximation; 3.5.3 Applications on Classification; 3.5.4 Applications on Clustering; 3.6 Conclusions and Discussions on Theoretical Properties; References; 4 Single Image Super-Resolution: From Sparse Coding to Deep Learning; 4.1 Robust Single Image Super-Resolution via Deep Networks with Sparse Prior; 4.1.1 Introduction
  • 4.1.2 Related Work; 4.1.3 Sparse Coding Based Network for Image SR; 4.1.3.1 Image SR Using Sparse Coding; 4.1.3.2 Network Implementation of Sparse Coding; 4.1.3.3 Network Architecture of SCN; 4.1.3.4 Advantages over Previous Models; 4.1.4 Network Cascade for Scalable SR; 4.1.4.1 Network Cascade for SR of a Fixed Scaling Factor; 4.1.4.2 Network Cascade for Scalable SR; 4.1.4.3 Training Cascade of Networks; 4.1.5 Robust SR for Real Scenarios; 4.1.5.1 Data-Driven SR by Fine-Tuning; 4.1.5.2 Iterative SR with Regularization; Blurry Image Upscaling; Noisy Image Upscaling; 4.1.6 Implementation Details
  • 4.1.7 Experiments; 4.1.7.1 Algorithm Analysis; 4.1.7.2 Comparison with State-of-the-Art; 4.1.7.3 Robustness to Real SR Scenarios; Data-Driven SR by Fine-Tuning; Regularized Iterative SR; 4.1.8 Subjective Evaluation; 4.1.9 Conclusion and Future Work; 4.2 Learning a Mixture of Deep Networks for Single Image Super-Resolution; 4.2.1 Introduction; 4.2.2 The Proposed Method; 4.2.3 Implementation Details; 4.2.4 Experimental Results; 4.2.4.1 Network Architecture Analysis; 4.2.4.2 Comparison with State-of-the-Art; 4.2.4.3 Runtime Analysis; 4.2.5 Conclusion and Future Work; References