Adaptive Computation and Machine Learning

Learning in Graphical Models

Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering: uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion that a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.

This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters: Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.
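As an illustrative sketch of the "simpler parts glued together by probability" idea described above (not an example taken from the book), the snippet below builds a toy directed graphical model over a three-variable chain A → B → C, whose joint distribution factorizes into local conditional tables. All variable names and probability values here are hypothetical.

```python
# Illustrative sketch (not from the book): a toy directed graphical model
# A -> B -> C, whose joint factorizes as p(A, B, C) = p(A) p(B|A) p(C|B).
# All variables and probability values are hypothetical.

p_A = {True: 0.3, False: 0.7}                      # prior p(A)
p_B_given_A = {True: {True: 0.8, False: 0.2},      # p(B | A)
               False: {True: 0.1, False: 0.9}}
p_C_given_B = {True: {True: 0.5, False: 0.5},      # p(C | B)
               False: {True: 0.05, False: 0.95}}

def joint(a, b, c):
    """Joint probability assembled from the simpler local factors."""
    return p_A[a] * p_B_given_A[a][b] * p_C_given_B[b][c]

# Marginalizing B and C out of the joint recovers the prior p(A=True) = 0.3,
# a small consistency check on the factorized model.
p_a_true = sum(joint(True, b, c) for b in (True, False) for c in (True, False))
print(p_a_true)  # 0.3 (up to floating-point rounding)
```

Each factor is specified and can be inspected on its own, which is the kind of modularity the description attributes to the graphical model formalism.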

644 pages, Paperback

First published March 31, 1998

About the author

Michael I. Jordan

12 books · 4 followers

Ratings & Reviews

Community Reviews

5 stars: 3 (27%)
4 stars: 5 (45%)
3 stars: 3 (27%)
2 stars: 0 (0%)
1 star: 0 (0%)
No one has reviewed this book yet.
