
Elements of Artificial Neural Networks

ISBN-10: 0262133288

ISBN-13: 9780262133289

Edition: 1997

Authors: Kishan Mehrotra, Chilukuri K. Mohan, Sanjay Ranka


Description:

Elements of Artificial Neural Networks provides a clearly organized general introduction, focusing on a broad range of algorithms, for students and others who want to use neural networks rather than simply study them. The authors, who have been developing and team-teaching the material in a one-semester course over the past six years, describe most of the basic neural network models (with several detailed solved examples) and discuss the rationale and advantages of the models, as well as their limitations.

The approach is practical and open-minded and requires very little mathematical or technical background. Written from a computer science and statistics point of view, the text stresses links to contiguous fields and can easily serve as a first course for students in economics and management.

The opening chapter sets the stage, presenting the basic concepts in a clear and objective way and tackling important -- yet rarely addressed -- questions related to the use of neural networks in practical situations. Subsequent chapters on supervised learning (single-layer and multilayer networks), unsupervised learning, and associative models are structured around classes of problems to which networks can be applied. Applications are discussed along with the algorithms. A separate chapter takes up optimization methods. The most frequently used algorithms, such as backpropagation, are introduced early on, right after perceptrons, so that these can form the basis for initiating course projects. Algorithms published as late as 1995 are also included.

All of the algorithms are presented using block-structured pseudo-code, and exercises are provided throughout. Software implementing many commonly used neural network algorithms is available at the book's website. Transparency masters, including abbreviated text and figures for the entire book, are available for instructors using the text.

Book details

List price: $75.00
Copyright year: 1997
Publisher: MIT Press
Publication date: 10/11/1996
Binding: Hardcover
Pages: 400
Size: 7.25" wide x 9.50" long x 1.00" tall
Weight: 2.046 lbs
Language: English

Preface
Introduction
History of Neural Networks
Structure and Function of a Single Neuron
Biological neurons
Artificial neuron models
Neural Net Architectures
Fully connected networks
Layered networks
Acyclic networks
Feedforward networks
Modular neural networks
Neural Learning
Correlation learning
Competitive learning
Feedback-based weight adaptation
What Can Neural Networks Be Used for?
Classification
Clustering
Vector quantization
Pattern association
Function approximation
Forecasting
Control applications
Optimization
Search
Evaluation of Networks
Quality of results
Generalizability
Computational resources
Implementation
Conclusion
Exercises
Supervised Learning: Single-Layer Networks
Perceptrons
Linear Separability
Perceptron Training Algorithm
Termination criterion
Choice of learning rate
Non-numeric inputs
Guarantee of Success
Modifications
Pocket algorithm
Adalines
Multiclass discrimination
Conclusion
Exercises
Supervised Learning: Multilayer Networks I
Multilevel Discrimination
Preliminaries
Architecture
Objectives
Backpropagation Algorithm
Setting the Parameter Values
Initialization of weights
Frequency of weight updates
Choice of learning rate
Momentum
Generalizability
Number of hidden layers and nodes
Number of samples
Theoretical Results*
Cover's theorem
Representations of functions
Approximations of functions
Accelerating the Learning Process
Quickprop algorithm
Conjugate gradient
Applications
Weaning from mechanically assisted ventilation
Classification of myoelectric signals
Forecasting commodity prices
Controlling a gantry crane
Conclusion
Exercises
Supervised Learning: Multilayer Networks II
Madalines
Adaptive Multilayer Networks
Network pruning algorithms
Marchand's algorithm
Upstart algorithm
Neural Tree
Cascade correlation
Tiling algorithm
Prediction Networks
Recurrent networks
Feedforward networks for forecasting
Radial Basis Functions
Polynomial Networks
Regularization
Conclusion
Exercises
Unsupervised Learning
Winner-Take-All Networks
Hamming networks
Maxnet
Simple competitive learning
Learning Vector Quantizers
Counterpropagation Networks
Adaptive Resonance Theory
Topologically Organized Networks
Self-organizing maps
Convergence*
Extensions
Distance-Based Learning
Maximum entropy
Neural gas
Neocognitron
Principal Component Analysis Networks
Conclusion
Exercises
Associative Models
Non-iterative Procedures for Association
Hopfield Networks
Discrete Hopfield networks
Storage capacity of Hopfield networks*
Continuous Hopfield networks
Brain-State-in-a-Box Network
Boltzmann Machines
Mean field annealing
Hetero-associators
Conclusion
Exercises
Optimization Methods
Optimization using Hopfield Networks
Traveling salesperson problem
Solving simultaneous linear equations
Allocating documents to multiprocessors
Discrete Hopfield network
Continuous Hopfield network
Performance
Iterated Gradient Descent
Simulated Annealing
Random Search
Evolutionary Computation
Evolutionary algorithms
Initialization
Termination criterion
Reproduction
Operators
Mutation
Crossover
Replacement
Schema Theorem*
Conclusion
Exercises
A Little Math
Calculus
Linear Algebra
Statistics
Data
Iris Data
Classification of Myoelectric Signals
Gold Prices
Clustering Animal Features
3-D Corners, Grid and Approximation
Eleven-City Traveling Salesperson Problem (Distances)
Daily Stock Prices of Three Companies, over the Same Period
Spiral Data
Bibliography
Index