LEADER |
00000cam a2200000 i 4500 |
001 |
OR_ocn892067383 |
003 |
OCoLC |
005 |
20231017213018.0 |
006 |
m o d |
007 |
cr cnu---unuuu |
008 |
141003s1993 maua fob 000 0 eng d |
040 |
|
|
|a OPELS
|b eng
|e rda
|e pn
|c OPELS
|d N$T
|d LIP
|d STF
|d OCLCQ
|d SFB
|d LUN
|d OCLCQ
|d OCLCO
|d OCLCQ
|
020 |
|
|
|a 9780080514338
|q (electronic bk.)
|
020 |
|
|
|a 0080514332
|q (electronic bk.)
|
020 |
|
|
|z 0124790402
|
020 |
|
|
|z 9780124790407
|
029 |
1 |
|
|a AU@
|b 000056933618
|
029 |
1 |
|
|a NZ1
|b 15922268
|
035 |
|
|
|a (OCoLC)892067383
|
050 |
|
4 |
|a QA76.87
|b M423 1993eb
|
072 |
|
7 |
|a COM
|x 000000
|2 bisacsh
|
082 |
0 |
4 |
|a 006.3
|2 22
|
084 |
|
|
|a *68N15
|2 msc
|
084 |
|
|
|a 68-01
|2 msc
|
084 |
|
|
|a 68T05
|2 msc
|
084 |
|
|
|a ST 250
|2 rvk
|
084 |
|
|
|a ST 250 C01
|2 rvk
|
084 |
|
|
|a ST 285
|2 rvk
|
084 |
|
|
|a ST 300
|2 rvk
|
084 |
|
|
|a DAT 717f
|2 stub
|
084 |
|
|
|a DAT 358f
|2 stub
|
049 |
|
|
|a UAMI
|
100 |
1 |
|
|a Masters, Timothy.
|
245 |
1 |
0 |
|a Practical neural network recipes in C++ /
|c Timothy Masters.
|
264 |
|
1 |
|a Boston :
|b Morgan Kaufmann,
|c [1993]
|
264 |
|
4 |
|c ©1993
|
300 |
|
|
|a 1 online resource (xviii, 493 pages) :
|b illustrations
|
300 |
|
|
|a 1 online resource (1 disc (3 1/2 in.))
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
500 |
|
|
|a Accompanied by Diskette (702000135).
|
504 |
|
|
|a Includes bibliographical references (pages 479-490).
|
520 |
|
|
|a This text serves as a cookbook for neural network solutions to practical problems using C++. It will enable those with moderate programming experience to select a neural network model appropriate to solving a particular problem, and to produce a working program implementing that network. The book provides guidance along the entire problem-solving path, including designing the training set, preprocessing variables, training and validating the network, and evaluating its performance. Though the book is not intended as a general course in neural networks, no background in neural networks is assumed and all models are presented from the ground up.
|
588 |
0 |
|
|a Print version record.
|
505 |
0 |
|
|a Front Cover; Practical Neural Network Recipes in C++; Copyright Page; Dedication; Table of Contents; Preface; Chapter 1. Foundations; Motivation; New Life for Old Techniques; Perceptrons and Linear Separability; Neural Network Capabilities; Basic Structure of a Neural Network; Training; Validation; Chapter 2. Classification; Binary Decisions; Multiple Classes; Supervised versus Unsupervised Training; Chapter 3. Autoassociation; Autoassociative Filtering; Noise Reduction; Learning a Prototype from Exemplars; Exposing Isolated Events; Pattern Completion; Error Correction; Data Compression.
|
505 |
8 |
|
|a Chapter 4. Time-Series Prediction; The Basic Model; Input Data; Multiple Prediction; Multiple Predictors; Measuring Prediction Error; Chapter 5. Function Approximation; Univariate Function Approximation; Inverse Modeling; Multiple Regression; Chapter 6. Multilayer Feedforward Networks; Basic Architecture; Theoretical Discussion; Algorithms for Executing the Network; Training the Network; Training by Backpropagation of Errors; Training by Conjugate Gradients; Eluding Local Minima in Learning; When to Use a Multiple-Layer Feedforward Network; Chapter 7. Eluding Local Minima I: Simulated Annealing.
|
505 |
8 |
|
|a Overview; Choosing the Annealing Parameters; Implementation in Feedforward Network Learning; A Sample Program; A Sample Function; Random Number Generation; Going on from Here; Chapter 8. Eluding Local Minima II: Genetic Optimization; Overview; Designing the Genetic Structure; Evaluation; Parent Selection; Reproduction; Mutation; A Genetic Minimization Subroutine; Some Functions for Genetic Optimization; Advanced Topics in Genetic Optimization; Chapter 9. Regression and Neural Networks; Overview; Singular-Value Decomposition; Regression in Neural Networks.
|
505 |
8 |
|
|a Chapter 10. Designing Feedforward Network Architectures; How Many Hidden Layers?; How Many Hidden Neurons?; How Long Do I Train This Thing???; Chapter 11. Interpreting Weights: How Does This Thing Work?; Features Used by Networks in General; Features Used by a Particular Network; Chapter 12. Probabilistic Neural Networks; Overview; Computational Aspects; Optimizing Sigma; A Sample Program; Bayesian Confidence Measures; Autoassociative Versions; When to Use a Probabilistic Neural Network; Chapter 13. Functional Link Networks; Application to Nonlinear Approximation.
|
505 |
8 |
|
|a Mathematics of the Functional Link Network; When to Use a Functional Link Network; Chapter 14. Hybrid Networks; Functional Link Net as a Hidden Layer; Fast Bayesian Confidences; Attention-based Processing; Factorable Problems; Chapter 15. Designing the Training Set; Number of Samples; Borderline Cases; Hidden Bias; Balancing the Classes; Fudging Cases; Chapter 16. Preparing Input Data; General Considerations; Types of Measurements; Is Scaling Always Necessary?; Transformations; Circular Discontinuity; Outliers; Missing Data; Chapter 17. Fuzzy Data and Processing.
|
590 |
|
|
|a O'Reilly
|b O'Reilly Online Learning: Academic/Public Library Edition
|
650 |
|
0 |
|a Neural networks (Computer science)
|
650 |
|
0 |
|a C++ (Computer program language)
|
650 |
|
6 |
|a Réseaux neuronaux (Informatique)
|
650 |
|
6 |
|a C++ (Langage de programmation)
|
650 |
|
7 |
|a COMPUTERS
|x General.
|2 bisacsh
|
650 |
|
7 |
|a C++ (Computer program language)
|2 fast
|0 (OCoLC)fst00843286
|
650 |
|
7 |
|a Neural networks (Computer science)
|2 fast
|0 (OCoLC)fst01036260
|
776 |
0 |
8 |
|i Print version:
|a Masters, Timothy.
|t Practical neural network recipes in C++
|z 0124790402
|w (OCoLC)762079168
|
856 |
4 |
0 |
|u https://learning.oreilly.com/library/view/~/9780080514338/?ar
|z Full text (prior registration with an institutional email address required)
|
938 |
|
|
|a EBSCOhost
|b EBSC
|n 919356
|
994 |
|
|
|a 92
|b IZTAP
|