Conformal prediction for reliable machine learning : theory, adaptations, and applications /

"Traditional, low-dimensional, small scale data have been successfully dealt with using conventional software engineering and classical statistical methods, such as discriminant analysis, neural networks, genetic algorithms and others. But the change of scale in data collection and the dimensio...

Descripción completa

Bibliographic Details
Classification: Electronic Book
Other Authors: Balasubramanian, Vineeth (Editor), Ho, Shen-Shyang (Editor), Vovk, Vladimir, 1960- (Editor)
Format: Electronic eBook
Language: English
Published: Amsterdam ; Boston : Morgan Kaufmann, ©2014.
Subjects:
Online Access: Full text

MARC

LEADER 00000cam a2200000 a 4500
001 SCIDIR_ocn878922864
003 OCoLC
005 20231120111600.0
006 m o d
007 cr cnu---unuuu
008 140502s2014 ne ob 001 0 eng d
040 |a IDEBK  |b eng  |e pn  |c IDEBK  |d N$T  |d YDXCP  |d OPELS  |d E7B  |d UIU  |d CDX  |d OCLCF  |d TPH  |d B24X7  |d COO  |d MFS  |d RIV  |d OCLCQ  |d OCLCO  |d OCLCQ  |d LIV  |d OCLCQ  |d U3W  |d D6H  |d INT  |d OTZ  |d AU@  |d OCLCQ  |d WYU  |d TKN  |d VT2  |d LQU  |d OCLCQ  |d UKAHL  |d BRF  |d OCLCO  |d OCLCQ  |d OCLCO 
019 |a 1066018126  |a 1105196318  |a 1105571256  |a 1129367580 
020 |a 9780124017153  |q (electronic bk.) 
020 |a 0124017150  |q (electronic bk.) 
020 |a 1306697484  |q (electronic bk.) 
020 |a 9781306697484  |q (electronic bk.) 
020 |z 9780123985378 
020 |z 0123985374 
035 |a (OCoLC)878922864  |z (OCoLC)1066018126  |z (OCoLC)1105196318  |z (OCoLC)1105571256  |z (OCoLC)1129367580 
050 4 |a Q325.5  |b C668 2014eb 
060 4 |a Online Book 
072 7 |a COM  |x 000000  |2 bisacsh 
082 0 4 |a 006.3/1  |2 23 
245 0 0 |a Conformal prediction for reliable machine learning :  |b theory, adaptations, and applications /  |c [edited by] Vineeth Balasubramanian, Shen-Shyang Ho, Vladimir Vovk. 
260 |a Amsterdam ;  |a Boston :  |b Morgan Kaufmann,  |c ©2014. 
300 |a 1 online resource 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
520 |a "Traditional, low-dimensional, small-scale data have been successfully dealt with using conventional software engineering and classical statistical methods, such as discriminant analysis, neural networks, genetic algorithms and others. But the change of scale in data collection and the dimensionality of modern data sets has profound implications for the type of analysis that can be done. Recently several kernel-based machine learning algorithms have been developed for dealing with high-dimensional problems, where a large number of features could cause a combinatorial explosion. These methods are quickly gaining popularity, and it is widely believed that they will help to meet the challenge of analysing very large data sets. Learning machines often perform well in a wide range of applications and have nice theoretical properties without requiring any parametric statistical assumption about the source of data (unlike traditional statistical techniques). However, a typical drawback of many machine learning algorithms is that they usually do not provide any useful measure of confidence in the predicted labels of new, unclassified examples. Confidence estimation is a well-studied area of both parametric and non-parametric statistics; however, usually only low-dimensional problems are considered"--  |c Provided by publisher 
504 |a Includes bibliographical references and index. 
588 0 |a Print version record. 
505 0 |a Half Title; Title Page; Copyright; Copyright Permissions; Contents; Contributing Authors; Foreword; Preface; Book Organization; Part I: Theory; Part II: Adaptations; Part III: Applications; Companion Website; Contacting Us; Acknowledgments; Part I: Theory; 1 The Basic Conformal Prediction Framework; 1.1 The Basic Setting and Assumptions; 1.2 Set and Confidence Predictors; 1.2.1 Validity and Efficiency of Set and Confidence Predictors; 1.3 Conformal Prediction; 1.3.1 The Binary Case; 1.3.2 The Gaussian Case; 1.4 Efficiency in the Case of Prediction without Objects. 
505 8 |a 1.5 Universality of Conformal Predictors; 1.6 Structured Case and Classification; 1.7 Regression; 1.8 Additional Properties of Validity and Efficiency in the Online Framework; 1.8.1 Asymptotically Efficient Conformal Predictors; Acknowledgments; 2 Beyond the Basic Conformal Prediction Framework; 2.1 Conditional Validity; 2.2 Conditional Conformal Predictors; 2.2.1 Venn's Dilemma; 2.3 Inductive Conformal Predictors; 2.3.1 Conditional Inductive Conformal Predictors; 2.4 Training Conditional Validity of Inductive Conformal Predictors; 2.5 Classical Tolerance Regions. 
505 8 |a 2.6 Object Conditional Validity and Efficiency; 2.6.1 Negative Result; 2.6.2 Positive Results; 2.7 Label Conditional Validity and ROC Curves; 2.8 Venn Predictors; 2.8.1 Inductive Venn Predictors; 2.8.2 Venn Prediction without Objects; Acknowledgments; Part II: Adaptations; 3 Active Learning; 3.1 Introduction; 3.2 Background and Related Work; 3.2.1 Pool-based Active Learning with Serial Query; SVM-based methods; Statistical methods; Ensemble-based methods; Other methods; 3.2.2 Batch Mode Active Learning; 3.2.3 Online Active Learning; 3.3 Active Learning Using Conformal Prediction. 
505 8 |a 3.3.1 Query by Transduction (QBT); Algorithmic formulation; 3.3.2 Generalized Query by Transduction; Algorithmic formulation; Combining multiple criteria in GQBT; 3.3.3 Multicriteria Extension to QBT; 3.4 Experimental Results; 3.4.1 Benchmark Datasets; 3.4.2 Application to Face Recognition; 3.4.3 Multicriteria Extension to QBT; 3.5 Discussion and Conclusions; Acknowledgments; 4 Anomaly Detection; 4.1 Introduction; 4.2 Background; 4.3 Conformal Prediction for Multiclass Anomaly Detection; 4.3.1 A Nonconformity Measure for Multiclass Anomaly Detection; 4.4 Conformal Anomaly Detection. 
505 8 |a 4.4.1 Conformal Anomalies; 4.4.2 Offline versus Online Conformal Anomaly Detection; 4.4.3 Unsupervised and Semi-supervised Conformal Anomaly Detection; 4.4.4 Classification Performance and Tuning of the Anomaly Threshold; 4.5 Inductive Conformal Anomaly Detection; 4.5.1 Offline and Semi-Offline Inductive Conformal Anomaly Detection; 4.5.2 Online Inductive Conformal Anomaly Detection; 4.6 Nonconformity Measures for Examples Represented as Sets of Points; 4.6.1 The Directed Hausdorff Distance; 4.6.2 The Directed Hausdorff k-Nearest Neighbors Nonconformity Measure. 
650 0 |a Machine learning. 
650 2 |a Machine Learning  |0 (DNLM)D000069550 
650 6 |a Apprentissage automatique.  |0 (CaQQLa)201-0131435 
650 7 |a COMPUTERS  |x General.  |2 bisacsh 
650 7 |a Machine learning  |2 fast  |0 (OCoLC)fst01004795 
700 1 |a Balasubramanian, Vineeth,  |e editor. 
700 1 |a Ho, Shen-Shyang,  |e editor. 
700 1 |a Vovk, Vladimir,  |d 1960-  |e editor. 
776 0 8 |i Print version:  |t Conformal prediction for reliable machine learning.  |d Amsterdam ; Boston : Morgan Kaufmann, 2014  |z 9780123985378 
856 4 0 |u https://sciencedirect.uam.elogim.com/science/book/9780123985378  |z Full text