Publisher description for The roots of backpropagation : from ordered derivatives to neural networks and political forecasting / Paul John Werbos.

Bibliographic record and links to related information available from the Library of Congress catalog

Information from electronic data provided by the publisher. May be incomplete or contain other coding.

Now, for the first time, the landmark work in backpropagation is available in print. Scientists, engineers, statisticians, operations researchers, and other investigators involved in neural networks have long sought direct access to Paul Werbos's groundbreaking, much-cited 1974 Harvard doctoral thesis, which laid the foundation of backpropagation. Now, with the publication of its full text in The Roots of Backpropagation, these practitioners can go straight to the original material and gain a deeper, practical understanding of this unique mathematical approach to social studies and related fields. In addition, Werbos provides three more recent research papers inspired by his original work, along with a new guide to the field.

Originally written for readers who lacked any knowledge of neural nets, The Roots of Backpropagation firmly establishes both its historical and continuing significance as it:

- Demonstrates the ongoing value and new potential of backpropagation
- Creates a wealth of sound mathematical tools useful across disciplines
- Sets the stage for the emerging area of fast automatic differentiation
- Describes new designs for forecasting and control that exploit backpropagation
- Unifies concepts from Freud, Jung, biologists, and others into a new mathematical picture of the human mind and how it works
- Certifies the viability of Deutsch's model of nationalism as a predictive tool, as well as the utility of extensions of this central paradigm

"What a delight it was to see Paul Werbos rediscover Freud's version of ‘back-propagation.' Freud was adamant (in The Project for a Scientific Psychology) that selective learning could take place only if the presynaptic neuron was as influenced as the postsynaptic neuron during excitation. Such activation of both sides of the contact barrier (Freud's name for the synapse) was accomplished by reducing synaptic resistance through the absorption of ‘energy' at the synaptic membranes. Not bad for 1895! But Werbos 1993 is even better."
-Karl H. Pribram, Professor Emeritus, Stanford University
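The "ordered derivatives" of the book's subtitle are what modern practice calls reverse-mode automatic differentiation: the sensitivity of a final output to each intermediate quantity, computed in one backward sweep through the ordered system of calculations. As an illustrative aside (not part of the publisher's description), here is a minimal tape-based sketch in Python; the class and function names are our own, chosen for clarity:

```python
import math

class Var:
    """A scalar node; .grad will hold the ordered derivative d(output)/d(self)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_partial) pairs
        self.grad = 0.0

# Elementary operations record their local partial derivatives on the tape.
def add(a, b):
    return Var(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Var(a.value * b.value, [(a, b.value), (b, a.value)])

def sin(a):
    return Var(math.sin(a.value), [(a, math.cos(a.value))])

def backprop(output):
    """Accumulate ordered derivatives in a single reverse sweep."""
    order, seen = [], set()
    def visit(v):  # topological order of the computation
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(output)
    output.grad = 1.0
    for v in reversed(order):  # chain rule for ordered derivatives
        for p, w in v.parents:
            p.grad += w * v.grad

# Example: f(x, y) = sin(x*y) + x at x=2, y=3
x, y = Var(2.0), Var(3.0)
out = add(sin(mul(x, y)), x)
backprop(out)
# Analytically: df/dx = y*cos(x*y) + 1, df/dy = x*cos(x*y)
```

One backward pass yields the gradient with respect to every input at a cost comparable to one forward evaluation, which is the efficiency property that makes both backpropagation and fast automatic differentiation practical.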

Library of Congress subject headings for this publication: Neural networks (Computer science); Regression analysis; Prediction theory