Adaptive Filter Theory

ISBN-10: 013267145X

ISBN-13: 9780132671453

Edition: 5th 2014 (Revised)

Author: Simon Haykin

List price: $244.20
Adaptive Filter Theory, 5e, is ideal for courses in adaptive filters. Haykin examines both the mathematical theory behind various linear adaptive filters and the elements of supervised multilayer perceptrons. In its fifth edition, this highly successful book has been updated and refined to stay current with the field and to develop the concepts in as unified and accessible a manner as possible.
Book details

List price: $244.20
Edition: 5th
Copyright year: 2014
Publisher: Pearson Education
Publication date: 5/23/2013
Binding: Hardcover
Pages: 912
Size: 7.25" wide x 9.25" long x 1.00" tall
Weight: 2.772 lbs

Preface
Acknowledgments
Background and Preview
The Filtering Problem
Linear Optimum Filters
Adaptive Filters
Linear Filter Structures
Approaches to the Development of Linear Adaptive Filters
Adaptive Beamforming
Four Classes of Applications
Historical Notes
Stochastic Processes and Models
Partial Characterization of a Discrete-Time Stochastic Process
Mean Ergodic Theorem
Correlation Matrix
Correlation Matrix of Sine Wave Plus Noise
Stochastic Models
Wold Decomposition
Asymptotic Stationarity of an Autoregressive Process
Yule-Walker Equations
Computer Experiment: Autoregressive Process of Order Two
Selecting the Model Order
Complex Gaussian Processes
Power Spectral Density
Properties of Power Spectral Density
Transmission of a Stationary Process Through a Linear Filter
Cramér Spectral Representation for a Stationary Process
Power Spectrum Estimation
Other Statistical Characteristics of a Stochastic Process
Polyspectra
Spectral-Correlation Density
Summary and Discussion
Problems
Wiener Filters
Linear Optimum Filtering: Statement of the Problem
Principle of Orthogonality
Minimum Mean-Square Error
Wiener-Hopf Equations
Error-Performance Surface
Multiple Linear Regression Model
Example
Linearly Constrained Minimum-Variance Filter
Generalized Sidelobe Cancellers
Summary and Discussion
Problems
Linear Prediction
Forward Linear Prediction
Backward Linear Prediction
Levinson-Durbin Algorithm
Properties of Prediction-Error Filters
Schur-Cohn Test
Autoregressive Modeling of a Stationary Stochastic Process
Cholesky Factorization
Lattice Predictors
All-Pole, All-Pass Lattice Filter
Joint-Process Estimation
Predictive Modeling of Speech
Summary and Discussion
Problems
Method of Steepest Descent
Basic Idea of the Steepest-Descent Algorithm
The Steepest-Descent Algorithm Applied to the Wiener Filter
Stability of the Steepest-Descent Algorithm
Example
The Steepest-Descent Algorithm Viewed as a Deterministic Search Method
Virtue and Limitation of the Steepest-Descent Algorithm
Summary and Discussion
Problems
Method of Stochastic Gradient Descent
Principles of Stochastic Gradient Descent
Application 1: Least-Mean-Square (LMS) Algorithm
Application 2: Gradient-Adaptive Lattice Filtering Algorithm
Other Applications of Stochastic Gradient Descent
Summary and Discussion
Problems
The Least-Mean-Square (LMS) Algorithm
Signal-Flow Graph
Optimality Considerations
Applications
Statistical Learning Theory
Transient Behavior and Convergence Considerations
Efficiency
Computer Experiment on Adaptive Prediction
Computer Experiment on Adaptive Equalization
Computer Experiment on a Minimum-Variance Distortionless-Response Beamformer
Summary and Discussion
Problems
Normalized Least-Mean-Square (LMS) Algorithm and Its Generalization
Normalized LMS Algorithm: The Solution to a Constrained Optimization Problem
Stability of the Normalized LMS Algorithm
Step-Size Control for Acoustic Echo Cancellation
Geometric Considerations Pertaining to the Convergence Process for Real-Valued Data
Affine Projection Adaptive Filters
Summary and Discussion
Problems
Block-Adaptive Filters
Block-Adaptive Filters: Basic Ideas
Fast Block LMS Algorithm
Unconstrained Frequency-Domain Adaptive Filters
Self-Orthogonalizing Adaptive Filters
Computer Experiment on Adaptive Equalization
Subband Adaptive Filters
Summary and Discussion
Problems
Method of Least-Squares
Statement of the Linear Least-Squares Estimation Problem
Data Windowing
Principle of Orthogonality Revisited
Minimum Sum of Error Squares
Normal Equations and Linear Least-Squares Filters
Time-Average Correlation Matrix Φ
Reformulation of the Normal Equations in Terms of Data Matrices
Properties of Least-Squares Estimates
Minimum-Variance Distortionless Response (MVDR) Spectrum Estimation
Regularized MVDR Beamforming
Singular-Value Decomposition
Pseudoinverse
Interpretation of Singular Values and Singular Vectors
Minimum-Norm Solution to the Linear Least-Squares Problem
Normalized LMS Algorithm Viewed as the Minimum-Norm Solution to an Underdetermined Least-Squares Estimation Problem
Summary and Discussion
Problems
The Recursive Least-Squares (RLS) Algorithm
Some Preliminaries
The Matrix Inversion Lemma
The Exponentially Weighted RLS Algorithm
Selection of the Regularization Parameter
Updated Recursion for the Sum of Weighted Error Squares
Example: Single-Weight Adaptive Noise Canceller
Statistical Learning Theory
Efficiency
Computer Experiment on Adaptive Equalization
Summary and Discussion
Problems
Robustness
Robustness, Adaptation, and Disturbances
Robustness: Preliminary Considerations Rooted in H∞ Optimization
Robustness of the LMS Algorithm
Robustness of the RLS Algorithm
Comparative Evaluations of the LMS and RLS Algorithms from the Perspective of Robustness
Risk-Sensitive Optimality
Trade-Offs Between Robustness and Efficiency
Summary and Discussion
Problems
Finite-Precision Effects
Quantization Errors
Least-Mean-Square (LMS) Algorithm
Recursive Least-Squares (RLS) Algorithm
Summary and Discussion
Problems
Adaptation in Nonstationary Environments
Causes and Consequences of Nonstationarity
The System Identification Problem
Degree of Nonstationarity
Criteria for Tracking Assessment
Tracking Performance of the LMS Algorithm
Tracking Performance of the RLS Algorithm
Comparison of the Tracking Performance of LMS and RLS Algorithms
Tuning of Adaptation Parameters
Incremental Delta-Bar-Delta (IDBD) Algorithm
Autostep Method
Computer Experiment: Mixture of Stationary and Nonstationary Environmental Data
Summary and Discussion
Problems
Kalman Filters
Recursive Minimum Mean-Square Estimation for Scalar Random Variables
Statement of the Kalman Filtering Problem
The Innovations Process
Estimation of the State Using the Innovations Process
Filtering
Initial Conditions
Summary of the Kalman Filter
Optimality Criteria for Kalman Filtering
Kalman Filter as the Unifying Basis for RLS Algorithms
Covariance Filtering Algorithm
Information Filtering Algorithm
Summary and Discussion
Problems
Square-Root Adaptive Filtering Algorithms
Square-Root Kalman Filters
Building Square-Root Adaptive Filters on the Two Kalman Filter Variants
QRD-RLS Algorithm
Adaptive Beamforming
Inverse QRD-RLS Algorithm
Finite-Precision Effects
Summary and Discussion
Problems
Order-Recursive Adaptive Filtering Algorithm
Order-Recursive Adaptive Filters Using Least-Squares Estimation: An Overview
Adaptive Forward Linear Prediction
Adaptive Backward Linear Prediction
Conversion Factor
Least-Squares Lattice (LSL) Predictor
Angle-Normalized Estimation Errors
First-Order State-Space Models for Lattice Filtering
QR-Decomposition-Based Least-Squares Lattice (QRD-LSL) Filters
Fundamental Properties of the QRD-LSL Filter
Computer Experiment on Adaptive Equalization
Recursive LSL Filters Using A Posteriori Estimation Errors
Recursive LSL Filters Using A Priori Estimation Errors with Error Feedback
Relation Between Recursive LSL and RLS Algorithms
Finite-Precision Effects
Summary and Discussion
Problems
Blind Deconvolution
Overview of Blind Deconvolution
Channel Identifiability Using Cyclostationary Statistics
Subspace Decomposition for Fractionally Spaced Blind Identification
Bussgang Algorithm for Blind Equalization
Extension of the Bussgang Algorithm to Complex Baseband Channels
Special Cases of the Bussgang Algorithm
Fractionally Spaced Bussgang Equalizers
Estimation of Unknown Probability Distribution Function of Signal Source
Summary and Discussion
Problems
Epilogue
Robustness, Efficiency, and Complexity
Kernel-Based Nonlinear Adaptive Filtering
Theory of Complex Variables
Cauchy-Riemann Equations
Cauchy's Integral Formula
Laurent's Series
Singularities and Residues
Cauchy's Residue Theorem
Principle of the Argument
Inversion Integral for the z-Transform
Parseval's Theorem
Wirtinger Calculus for Computing Complex Gradients
Wirtinger Calculus: Scalar Gradients
Generalized Wirtinger Calculus: Gradient Vectors
Another Approach to Compute Gradient Vectors
Expressions for the Partial Derivatives ∂f/∂z and ∂f/∂z*
Method of Lagrange Multipliers
Optimization Involving a Single Equality Constraint
Optimization Involving Multiple Equality Constraints
Optimum Beamformer
Estimation Theory
Likelihood Function
Cramér-Rao Inequality
Properties of Maximum-Likelihood Estimators
Conditional Mean Estimator
Eigenanalysis
The Eigenvalue Problem
Properties of Eigenvalues and Eigenvectors
Low-Rank Modeling
Eigenfilters
Eigenvalue Computations
Langevin Equation of Nonequilibrium Thermodynamics
Brownian Motion
Langevin Equation
Rotations and Reflections
Plane Rotations
Two-Sided Jacobi Algorithm
Cyclic Jacobi Algorithm
Householder Transformation
The QR Algorithm
Complex Wishart Distribution
Definition
The Chi-Square Distribution as a Special Case
Properties of the Complex Wishart Distribution
Expectation of the Inverse Correlation Matrix Φ⁻¹(n)
Glossary
Text Conventions
Abbreviations
Principal Symbols
Bibliography
Suggested Reading
Index