Neural nets and chaotic carriers
Neural Nets and Chaotic Carriers develops rational principles for the design of associative memories, with a view to applying these principles to models with the irregularly oscillatory operation so evident in biological neural systems, and necessitated by the meaninglessness of absolute signal levels....
| Classification: | Electronic book (eBook) |
|---|---|
| Main author: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | London : Imperial College Press ; Hackensack, NJ : Distributed by World Scientific Pub., ©2010. |
| Edition: | 2nd ed. |
| Series: | Advances in computer science and engineering. Texts ; v. 5. |
| Subjects: | |
| Online access: | Full text |
Table of Contents:
- 1. Introduction and aspirations
- 2. Optimal statistical procedures. 2.1. The optimisation of actions. 2.2. Effective estimation of state. 2.3. The quadratic/Gaussian case : estimation and certainty equivalence. 2.4. The linear model, in Bayesian and classic versions
- 3. Linear links and nonlinear knots : The basic neural net. 3.1. Neural calculations : The linear gate and the McCulloch-Pitts net. 3.2. Sigmoid and threshold functions. 3.3. Iteration. 3.4. Neural systems and feedback in continuous time. 3.5. Equilibrium excitation patterns. 3.6. Some special-purpose nets
- 4. Bifurcations and chaos. 4.1. The Hopf bifurcation. 4.2. Chaos
- 5. What is a memory? The Hamming and Hopfield nets. 5.1. Associative memories. 5.2. The Hamming net. 5.3. Autoassociation, feedback and storage. 5.4. The Hopfield net. 5.5. Alternative formulations of the Hopfield net
- 6. Compound and 'spurious' traces. 6.1. Performance and trace structure. 6.2. The recognition of simple traces. 6.3. Inference for compound traces. 6.4. Network realisation of the quantised regression. 6.5. Reliability constraints for the quantised regression. 6.6. Stability constraints for the quantised regression. 6.7. The Hopfield net
- 7. Preserving plasticity : A Bayesian approach. 7.1. A Bayesian view. 7.2. A robust estimation method. 7.3. Dynamic and neural versions of the algorithm
- 8. The key task : the fixing of fading data. Conclusions I. 8.1. Fading data, and the need for quantisation. 8.2. The probability-maximising algorithm (PMA). 8.3. Properties of the vector activation function F(z). 8.4. Some special cases. 8.5. The network realisation of the full PMA. 8.6. Neural implementation of the PMA. 8.7. The PMA and the exponential family. 8.8. Conclusions I
- 9. Performance of the probability-maximising algorithm. 9.1. A general formulation. 9.2. Considerations for reliable inference. 9.3. Performance of the PMA for simple stimuli. 9.4. Compound stimuli : The general pattern. 9.5. Compound stimuli in the Gaussian case
- 10. Other memories; other considerations. 10.1. The supervised learning of a linear relation. 10.2. Unsupervised learning : The criterion of economy.