Publisher description for Information theory, inference, and learning algorithms / David J.C. MacKay.
This textbook offers comprehensive coverage of Shannon's theory of information as well as the theory of neural networks and probabilistic data modelling. It includes explanations of Shannon's source coding theorem and noisy-channel coding theorem, along with descriptions of practical data compression systems. Many examples and exercises make the book ideal as a class textbook for students, or as a resource for researchers who need to work with neural networks or state-of-the-art error-correcting codes.
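To illustrate the source coding theorem mentioned above: it states that the Shannon entropy of a source is the lower bound, in bits per symbol, on lossless compression of that source. A minimal sketch (not taken from the book; the probabilities below are illustrative assumptions):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: the source coding theorem's lower
    bound on the average bits per symbol needed for lossless
    compression of an i.i.d. source with these probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A biased four-symbol source can be compressed below the
# 2 bits/symbol that a fixed-length code would require.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
print(entropy([0.25] * 4))                 # 2.0 bits/symbol (uniform)
```

A Huffman code for the biased source above (codewords of lengths 1, 2, 3, 3) achieves this 1.75 bits/symbol bound exactly, since each probability is a power of two.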
Library of Congress subject headings for this publication: Information theory