Preface

Introduction
  History of Neural Networks
  Structure and Function of a Single Neuron
    Biological neurons
    Artificial neuron models
  Neural Net Architectures
    Fully connected networks
    Layered networks
    Acyclic networks
    Feedforward networks
    Modular neural networks
  Neural Learning
    Correlation learning
    Competitive learning
    Feedback-based weight adaptation
  What Can Neural Networks Be Used for?
    Classification
    Clustering
    Vector quantization
    Pattern association
    Function approximation
    Forecasting
    Control applications
    Optimization
    Search
  Evaluation of Networks
    Quality of results
    Generalizability
    Computational resources
  Implementation
  Conclusion
  Exercises

Supervised Learning: Single-Layer Networks
  Perceptrons
  Linear Separability
  Perceptron Training Algorithm
    Termination criterion
    Choice of learning rate
    Non-numeric inputs
  Guarantee of Success
  Modifications
    Pocket algorithm
    Adalines
    Multiclass discrimination
  Conclusion
  Exercises

Supervised Learning: Multilayer Networks I
  Multilevel Discrimination
  Preliminaries
    Architecture
    Objectives
  Backpropagation Algorithm
  Setting the Parameter Values
    Initialization of weights
    Frequency of weight updates
    Choice of learning rate
    Momentum
    Generalizability
    Number of hidden layers and nodes
    Number of samples
  Theoretical Results*
    Cover's theorem
    Representations of functions
    Approximations of functions
  Accelerating the Learning Process
    Quickprop algorithm
    Conjugate gradient
  Applications
    Weaning from mechanically assisted ventilation
    Classification of myoelectric signals
    Forecasting commodity prices
    Controlling a gantry crane
  Conclusion
  Exercises

Supervised Learning: Multilayer Networks II
  Madalines
  Adaptive Multilayer Networks
    Network pruning algorithms
    Marchand's algorithm
    Upstart algorithm
    Neural Tree
    Cascade correlation
    Tiling algorithm
  Prediction Networks
    Recurrent networks
    Feedforward networks for forecasting
  Radial Basis Functions
  Polynomial Networks
  Regularization
  Conclusion
  Exercises

Unsupervised Learning
  Winner-Take-All Networks
    Hamming networks
    Maxnet
    Simple competitive learning
  Learning Vector Quantizers
  Counterpropagation Networks
  Adaptive Resonance Theory
  Topologically Organized Networks
    Self-organizing maps
    Convergence*
    Extensions
  Distance-Based Learning
    Maximum entropy
    Neural gas
  Neocognitron
  Principal Component Analysis Networks
  Conclusion
  Exercises

Associative Models
  Non-iterative Procedures for Association
  Hopfield Networks
    Discrete Hopfield networks
    Storage capacity of Hopfield networks*
    Continuous Hopfield networks
  Brain-State-in-a-Box Network
  Boltzmann Machines
    Mean field annealing
  Hetero-associators
  Conclusion
  Exercises

Optimization Methods
  Optimization using Hopfield Networks
    Traveling salesperson problem
    Solving simultaneous linear equations
    Allocating documents to multiprocessors
      Discrete Hopfield network
      Continuous Hopfield network
      Performance
  Iterated Gradient Descent
  Simulated Annealing
  Random Search
  Evolutionary Computation
    Evolutionary algorithms
    Initialization
    Termination criterion
    Reproduction
    Operators
      Mutation
      Crossover
    Replacement
    Schema Theorem*
  Conclusion
  Exercises

A Little Math
  Calculus
  Linear Algebra
  Statistics

Data
  Iris Data
  Classification of Myoelectric Signals
  Gold Prices
  Clustering Animal Features
  3-D Corners, Grid and Approximation
  Eleven-City Traveling Salesperson Problem (Distances)
  Daily Stock Prices of Three Companies, over the Same Period
  Spiral Data

Bibliography

Index