
Bayesian Networks: An Introduction


ISBN-10: 0470743042

ISBN-13: 9780470743041

Edition: 2009

Authors: Timo Koski, John Noble


Book details

Copyright year: 2009
Publisher: John Wiley & Sons, Limited
Publication date: 9/25/2009
Binding: Hardcover
Pages: 368
Size: 6.95" wide x 10.00" long x 1.00" tall
Weight: 1.672 lbs
Language: English


Preface
Graphical models and probabilistic reasoning
Introduction
Axioms of probability and basic notations
The Bayes update of probability
Inductive learning
Bayes' rule
Jeffrey's rule
Pearl's method of virtual evidence
Interpretations of probability and Bayesian networks
Learning as inference about parameters
Bayesian statistical inference
Tossing a thumb-tack
Multinomial sampling and the Dirichlet integral
Notes
Exercises: Probabilistic theories of causality, Bayes' rule, multinomial sampling and the Dirichlet density
Conditional independence, graphs and d-separation
Joint probabilities
Conditional independence
Directed acyclic graphs and d-separation
Graphs
Directed acyclic graphs and probability distributions
The Bayes ball
Illustrations
Potentials
Bayesian networks
Object-oriented Bayesian networks
d-Separation and conditional independence
Markov models and Bayesian networks
I-maps and Markov equivalence
The trek and a distribution without a faithful graph
Notes
Exercises: Conditional independence and d-separation
Evidence, sufficiency and Monte Carlo methods
Hard evidence
Soft evidence and virtual evidence
Jeffrey's rule
Pearl's method of virtual evidence
Queries in probabilistic inference
The chest clinic problem
Bucket elimination
Bayesian sufficient statistics and prediction sufficiency
Bayesian sufficient statistics
Prediction sufficiency
Prediction sufficiency for a Bayesian network
Time variables
A brief introduction to Markov chain Monte Carlo methods
Simulating a Markov chain
Irreducibility, aperiodicity and time reversibility
The Metropolis-Hastings algorithm
The one-dimensional discrete Metropolis algorithm
Notes
Exercises: Evidence, sufficiency and Monte Carlo methods
Decomposable graphs and chain graphs
Definitions and notations
Decomposable graphs and triangulation of graphs
Junction trees
Markov equivalence
Markov equivalence, the essential graph and chain graphs
Notes
Exercises: Decomposable graphs and chain graphs
Learning the conditional probability potentials
Initial illustration: maximum likelihood estimate for a fork connection
The maximum likelihood estimator for multinomial sampling
MLE for the parameters in a DAG: the general setting
Updating, missing data, fractional updating
Notes
Exercises: Learning the conditional probability potentials
Learning the graph structure
Assigning a probability distribution to the graph structure
Markov equivalence and consistency
Establishing the DAG isomorphic property
Reducing the size of the search
The Chow-Liu tree
The Chow-Liu tree: A predictive approach
The K2 structural learning algorithm
The MMHC algorithm
Monte Carlo methods for locating the graph structure
Women in mathematics
Notes
Exercises: Learning the graph structure
Parameters and sensitivity
Changing parameters in a network
Measures of divergence between probability distributions
The Chan-Darwiche distance measure
Comparison with the Kullback-Leibler divergence and Euclidean distance
Global bounds for queries
Applications to updating
Parameter changes to satisfy query constraints
Binary variables
The sensitivity of queries to parameter changes
Notes
Exercises: Parameters and sensitivity
Graphical models and exponential families
Introduction to exponential families
Standard examples of exponential families
Graphical models and exponential families
Noisy 'or' as an exponential family
Properties of the log partition function
Fenchel-Legendre conjugate
Kullback-Leibler divergence
Mean field theory
Conditional Gaussian distributions
CG potentials
Some results on marginalization
CG regression
Notes
Exercises: Graphical models and exponential families
Causality and intervention calculus
Introduction
Conditioning by observation and by intervention
The intervention calculus for a Bayesian network
Establishing the model via a controlled experiment
Properties of intervention calculus
Transformations of probability
A note on the order of 'see' and 'do' conditioning
The 'Sure Thing' principle
Back door criterion, confounding and identifiability
Notes
Exercises: Causality and intervention calculus
The junction tree and probability updating
Probability updating using a junction tree
Potentials and the distributive law
Marginalization and the distributive law
Elimination and domain graphs
Factorization along an undirected graph
Factorizing along a junction tree
Flow of messages: initial illustration
Local computation on junction trees
Schedules
Local and global consistency
Message passing for conditional Gaussian distributions
Using a junction tree with virtual evidence and soft evidence
Notes
Exercises: The junction tree and probability updating
Factor graphs and the sum-product algorithm
Factorization and local potentials
Examples of factor graphs
The sum-product algorithm
Detailed illustration of the algorithm
Notes
Exercise: Factor graphs and the sum-product algorithm
References
Index