Table of contents for Neural networks and learning machines / Simon Haykin.

Bibliographic record and links to related information available from the Library of Congress catalog.

Note: Contents data are machine generated from pre-publication information provided by the publisher. Contents may differ from the printed book, be incomplete, or contain coding artifacts.


Contents
Preface
Background and Preview
	1.	The Filtering Problem
	2.	Linear Optimum Filters
	3.	Adaptive Filters
	4.	Linear Filter Structures
	5.	Approaches to the Development of Linear Adaptive Filters
	6.	Adaptive Beamforming
	7.	Four Classes of Applications
	8.	Historical Notes
Chapter 1 Stochastic Processes and Models
	1.1	Partial Characterization of a Discrete-Time Stochastic Process
	1.2	Mean Ergodic Theorem
	1.3	Correlation Matrix
	1.4	Correlation Matrix of Sine Wave Plus Noise
	1.5	Stochastic Models
	1.6	Wold Decomposition
	1.7	Asymptotic Stationarity of an Autoregressive Process
	1.8	Yule-Walker Equations
	1.9	Computer Experiment: Autoregressive Process of Order Two
	1.10	Selecting the Model Order
	1.11	Complex Gaussian Processes
	1.12	Power Spectral Density
	1.13	Properties of Power Spectral Density
	1.14	Transmission of a Stationary Process Through a Linear Filter
	1.15	Cramér Spectral Representation for a Stationary Process
	1.16	Power Spectrum Estimation
	1.17	Other Statistical Characteristics of a Stochastic Process
	1.18	Polyspectra
	1.19	Spectral-Correlation Density
	1.20	Summary
		Problems
Chapter 10 Kalman Filters
	10.1	Recursive Minimum Mean-Square Estimation for Scalar Random Variables
	10.2	Statement of the Kalman Filtering Problem
	10.3	The Innovations Process
	10.4	Estimation of the State Using the Innovations Process
	10.5	Filtering
	10.6	Initial Conditions
	10.7	Summary of the Kalman Filter
	10.8	Kalman Filter as the Unifying Basis for RLS Filters
	10.9	Variants of the Kalman Filter
	10.10	The Extended Kalman Filter
	10.11	Summary
		Problems
Appendix A Complex Variables
	A.1	Cauchy-Riemann Equations
	A.2	Cauchy's Integral Formula
	A.3	Laurent's Series
	A.4	Singularities and Residues
	A.5	Cauchy's Residue Theorem
	A.6	Principle of the Argument
	A.7	Inversion Integral for the z-Transform
	A.8	Parseval's Theorem
Appendix B Differentiation with Respect to a Vector
	B.1	Basic Definitions
	B.2	Examples
	B.3	Relation Between the Derivative with Respect to a Vector and the Gradient Vector
Appendix C Complex Wishart Distribution
	C.1	Definition
	C.2	The Chi-Square Distribution as a Special Case
	C.3	Properties of the Complex Wishart Distribution
	C.4	Expectation of the Inverse Correlation Matrix Φ⁻¹(n)
Glossary
Bibliography
Index

Library of Congress Subject Headings for this publication:

Neural networks (Computer science).
Adaptive filters.