LEADER    00000nam a22000005i 4500
001       978-0-387-31240-8
003       DE-He213
005       20220120150659.0
007       cr nn 008mamaa
008       100301s2006 xxu| s |||| 0|eng d
020       |a 9780387312408 |9 978-0-387-31240-8
024 7     |a 10.1007/0-387-31240-4 |2 doi
050  4    |a QA75.5-76.95
072  7    |a UYA |2 bicssc
072  7    |a COM014000 |2 bisacsh
072  7    |a UYA |2 thema
082 04    |a 004.0151 |2 23
100 1     |a Nikolaev, Nikolay. |e author. |4 aut |4 http://id.loc.gov/vocabulary/relators/aut
245 10    |a Adaptive Learning of Polynomial Networks |h [electronic resource] : |b Genetic Programming, Backpropagation and Bayesian Methods / |c by Nikolay Nikolaev, Hitoshi Iba.
250       |a 1st ed. 2006.
264  1    |a New York, NY : |b Springer US : |b Imprint: Springer, |c 2006.
300       |a XIV, 316 p. |b online resource.
336       |a text |b txt |2 rdacontent
337       |a computer |b c |2 rdamedia
338       |a online resource |b cr |2 rdacarrier
347       |a text file |b PDF |2 rda
490 1     |a Genetic and Evolutionary Computation, |x 1932-0175
505 0     |a Inductive Genetic Programming -- Tree-Like PNN Representations -- Fitness Functions and Landscapes -- Search Navigation -- Backpropagation Techniques -- Temporal Backpropagation -- Bayesian Inference Techniques -- Statistical Model Diagnostics -- Time Series Modelling -- Conclusions.
520       |a This book provides theoretical and practical knowledge for development of algorithms that infer linear and nonlinear models. It offers a methodology for inductive learning of polynomial neural network models from data. The design of such tools contributes to better statistical data modelling when addressing tasks from various areas like system identification, chaotic time-series prediction, financial forecasting and data mining. The main claim is that the model identification process involves several equally important steps: finding the model structure, estimating the model weight parameters, and tuning these weights with respect to the adopted assumptions about the underlying data distribution. When the learning process is organized according to these steps, performed together one after the other or separately, one may expect to discover models that generalize well (that is, predict well). The book offers statisticians a shift in focus from the standard linear models toward highly nonlinear models that can be found by contemporary learning approaches. Specialists in statistical learning will read about alternative probabilistic search algorithms that discover the model architecture, and about neural network training techniques that identify accurate polynomial weights. They will be pleased to find out that the discovered models can be easily interpreted and admit statistical diagnosis by standard statistical means. Covering the three fields of evolutionary computation, neural networks and Bayesian inference, the book is oriented to a large audience of researchers and practitioners.
650  0    |a Computer science.
650  0    |a Artificial intelligence.
650 14    |a Theory of Computation.
650 24    |a Artificial Intelligence.
700 1     |a Iba, Hitoshi. |e author. |4 aut |4 http://id.loc.gov/vocabulary/relators/aut
710 2     |a SpringerLink (Online service)
773 0     |t Springer Nature eBook
776 08    |i Printed edition: |z 9780387511764
776 08    |i Printed edition: |z 9781441940605
776 08    |i Printed edition: |z 9780387312392
830  0    |a Genetic and Evolutionary Computation, |x 1932-0175
856 40    |u https://doi.uam.elogim.com/10.1007/0-387-31240-4 |z Texto Completo
912       |a ZDB-2-SCS
912       |a ZDB-2-SXCS
950       |a Computer Science (SpringerNature-11645)
950       |a Computer Science (R0) (SpringerNature-43710)