LEADER 00000cam a2200000Mi 4500
001 EBOOKCENTRAL_ocn859886714
003 OCoLC
005 20240329122006.0
006 m o d
007 cr |||||||||||
008 100122s2010 njua fob 001 0 eng d
040 __ |a NLE |b eng |e pn |c NLE |d DEBSZ |d OCLCO |d OCLCQ |d MERUC |d U3W |d OCLCF |d ICG |d INT |d OCLCQ |d DKC |d OCLCQ |d HS0 |d OCLCO |d OCLCQ |d OCLCO |d OCLCL
020 __ |a 9814271071
020 __ |a 9789814271073
029 1_ |a AU@ |b 000055921200
029 1_ |a DEBBG |b BV044178887
029 1_ |a DEBSZ |b 405718470
029 1_ |a DEBSZ |b 445579455
035 __ |a (OCoLC)859886714
050 _4 |a TK7882.P3 R65 2010
082 04 |a 006.40151 |2 22
049 __ |a UAMI
100 1_ |a Rokach, Lior.
245 10 |a Pattern classification using ensemble methods / |c Lior Rokach.
260 __ |a Hackensack, N.J. ; |a London : |b World Scientific, |c ©2010.
300 __ |a 1 online resource (xv, 225 pages) : |b illustrations.
336 __ |a text |b txt |2 rdacontent
337 __ |a computer |b c |2 rdamedia
338 __ |a online resource |b cr |2 rdacarrier
490 1_ |a Series in machine perception and artificial intelligence ; |v v. 75
504 __ |a Includes bibliographical references and index.
505 0_ |a Preface; 1. Introduction to Pattern Classification; 1.1 Pattern Classification; 1.2 Induction Algorithms; 1.3 Rule Induction; 1.4 Decision Trees; 1.5 Bayesian Methods; 1.5.1 Overview; 1.5.2 Naïve Bayes; 1.5.2.1 The Basic Naïve Bayes Classifier; 1.5.2.2 Naïve Bayes Induction for Numeric Attributes; 1.5.2.3 Correction to the Probability Estimation; 1.5.2.4 Laplace Correction; 1.5.2.5 No Match; 1.5.3 Other Bayesian Methods; 1.6 Other Induction Methods; 1.6.1 Neural Networks; 1.6.2 Genetic Algorithms; 1.6.3 Instance-based Learning; 1.6.4 Support Vector Machines
505 8_ |a 2. Introduction to Ensemble Learning; 2.1 Back to the Roots; 2.2 The Wisdom of Crowds; 2.3 The Bagging Algorithm; 2.4 The Boosting Algorithm; 2.5 The AdaBoost Algorithm; 2.6 No Free Lunch Theorem and Ensemble Learning; 2.7 Bias-Variance Decomposition and Ensemble Learning; 2.8 Occam's Razor and Ensemble Learning; 2.9 Classifier Dependency; 2.9.1 Dependent Methods; 2.9.1.1 Model-guided Instance Selection; 2.9.1.2 Basic Boosting Algorithms; 2.9.1.3 Advanced Boosting Algorithms; 2.9.1.4 Incremental Batch Learning; 2.9.2 Independent Methods; 2.9.2.1 Bagging; 2.9.2.2 Wagging
505 8_ |a 2.9.2.3 Random Forest and Random Subspace Projection; 2.9.2.4 Non-Linear Boosting Projection (NLBP); 2.9.2.5 Cross-validated Committees; 2.9.2.6 Robust Boosting; 2.10 Ensemble Methods for Advanced Classification Tasks; 2.10.1 Cost-Sensitive Classification; 2.10.2 Ensemble for Learning Concept Drift; 2.10.3 Reject Driven Classification; 3. Ensemble Classification; 3.1 Fusion Methods; 3.1.1 Weighting Methods; 3.1.2 Majority Voting; 3.1.3 Performance Weighting; 3.1.4 Distribution Summation; 3.1.5 Bayesian Combination; 3.1.6 Dempster-Shafer; 3.1.7 Vogging; 3.1.8 Naïve Bayes
505 8_ |a 3.1.9 Entropy Weighting; 3.1.10 Density-based Weighting; 3.1.11 DEA Weighting Method; 3.1.12 Logarithmic Opinion Pool; 3.1.13 Order Statistics; 3.2 Selecting Classification; 3.2.1 Partitioning the Instance Space; 3.2.1.1 The K-Means Algorithm as a Decomposition Tool; 3.2.1.2 Determining the Number of Subsets; 3.2.1.3 The Basic K-Classifier Algorithm; 3.2.1.4 The Heterogeneity Detecting K-Classifier (HDK-Classifier); 3.2.1.5 Running-Time Complexity; 3.3 Mixture of Experts and Meta Learning; 3.3.1 Stacking; 3.3.2 Arbiter Trees; 3.3.3 Combiner Trees; 3.3.4 Grading; 3.3.5 Gating Network
505 8_ |a 4. Ensemble Diversity; 4.1 Overview; 4.2 Manipulating the Inducer; 4.2.1 Manipulation of the Inducer's Parameters; 4.2.2 Starting Point in Hypothesis Space; 4.2.3 Hypothesis Space Traversal; 4.3 Manipulating the Training Samples; 4.3.1 Resampling; 4.3.2 Creation; 4.3.3 Partitioning; 4.4 Manipulating the Target Attribute Representation; 4.4.1 Label Switching; 4.5 Partitioning the Search Space; 4.5.1 Divide and Conquer; 4.5.2 Feature Subset-based Ensemble Methods; 4.5.2.1 Random-based Strategy; 4.5.2.2 Reduct-based Strategy; 4.5.2.3 Collective-Performance-based Strategy
520 __ |a Researchers from various disciplines such as pattern recognition, statistics, and machine learning have explored the use of ensemble methodology since the late seventies. Given the growing interest in the field, they are faced with a wide variety of methods. This book aims to impose a degree of order upon this diversity by presenting a coherent and unified repository of ensemble methods, theories, trends, challenges and applications. It describes in detail the classical methods, as well as the extensions and novel approaches developed recently, along with algorithmic descriptions of each method.
590 __ |a ProQuest Ebook Central |b Ebook Central Academic Complete
650 _0 |a Pattern recognition systems |x Mathematics.
650 _0 |a Set theory.
650 _6 |a Reconnaissance des formes (Informatique) |x Mathématiques.
650 _6 |a Théorie des ensembles.
650 _7 |a Set theory |2 fast
758 __ |i has work: |a Pattern classification using ensemble methods (Text) |1 https://id.oclc.org/worldcat/entity/E39PCFY3y94KtPVkRc4fJKtMyd |4 https://id.oclc.org/worldcat/ontology/hasWork
776 0_ |c Hardback |z 9789814271066
830 _0 |a Series in machine perception and artificial intelligence ; |v v. 75.
856 40 |u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=1679487 |z Full text
994 __ |a 92 |b IZTAP