
A First Course in Bayesian Statistical Methods


ISBN-10: 0387922997

ISBN-13: 9780387922997

Edition: 2009

Author: Peter D. Hoff


Description:

This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers with only a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized…    

Book details

Copyright year: 2009
Publisher: Springer New York
Publication date: 6/15/2009
Binding: Hardcover
Pages: 271
Size: 6.10" wide x 9.25" long x 0.75" tall
Weight: 1.496 lbs
Language: English

Table of contents

Introduction and examples
Introduction
Why Bayes?
Estimating the probability of a rare event
Building a predictive model
Where we are going
Discussion and further references
Belief, probability and exchangeability
Belief functions and probabilities
Events, partitions and Bayes' rule
Independence
Random variables
Discrete random variables
Continuous random variables
Descriptions of distributions
Joint distributions
Independent random variables
Exchangeability
de Finetti's theorem
Discussion and further references
One-parameter models
The binomial model
Inference for exchangeable binary data
Confidence regions
The Poisson model
Posterior inference
Example: Birth rates
Exponential families and conjugate priors
Discussion and further references
Monte Carlo approximation
The Monte Carlo method
Posterior inference for arbitrary functions
Sampling from predictive distributions
Posterior predictive model checking
Discussion and further references
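The Monte Carlo chapter approximates posterior quantities by averaging over simulated draws from the posterior. As a minimal illustration of that idea (not code from the book; the data and prior below are made up), here is a Monte Carlo approximation for a conjugate beta-binomial posterior, where the exact answer is known and can be checked:

```python
import random

random.seed(1)

# Binomial model with a Beta(1, 1) prior; made-up data:
# 6 successes in 10 trials, giving a Beta(7, 5) posterior.
a, b = 1 + 6, 1 + 4

# Monte Carlo approximation: draw S posterior samples and average.
S = 100_000
samples = [random.betavariate(a, b) for _ in range(S)]

post_mean = sum(samples) / S                      # approximates a/(a+b) = 7/12
prob_gt_half = sum(t > 0.5 for t in samples) / S  # approximates Pr(theta > 0.5 | data)
```

The same recipe extends to arbitrary functions of the parameter, as the chapter discusses: replace `t > 0.5` with any function of the sampled value and average.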
The normal model
The normal model
Inference for the mean, conditional on the variance
Joint inference for the mean and variance
Bias, variance and mean squared error
Prior specification based on expectations
The normal model for non-normal data
Discussion and further references
Posterior approximation with the Gibbs sampler
A semiconjugate prior distribution
Discrete approximations
Sampling from the conditional distributions
Gibbs sampling
General properties of the Gibbs sampler
Introduction to MCMC diagnostics
Discussion and further references
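The Gibbs sampler chapter alternates draws from the full conditional distributions under a semiconjugate prior for the normal model. A minimal Python sketch of that scheme (made-up data and hyperparameters; illustrative only, not the book's code):

```python
import random

random.seed(2)

# Made-up data; semiconjugate prior: mu ~ N(mu0, tau0_sq),
# 1/sigma^2 ~ Gamma(nu0/2, nu0*sigma0_sq/2).
y = [9.8, 10.2, 10.5, 9.9, 10.1]
n, ybar = len(y), sum(y) / len(y)
mu0, tau0_sq = 10.0, 1.0
nu0, sigma0_sq = 1.0, 1.0

mu, sigma_sq = ybar, 1.0  # starting values
mus = []
for s in range(5000):
    # full conditional of mu given sigma^2: normal
    tau_n_sq = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    mu_n = tau_n_sq * (mu0 / tau0_sq + n * ybar / sigma_sq)
    mu = random.gauss(mu_n, tau_n_sq ** 0.5)

    # full conditional of 1/sigma^2 given mu: gamma
    # (random.gammavariate takes shape and SCALE, so scale = 2/rate)
    ss = sum((yi - mu) ** 2 for yi in y)
    precision = random.gammavariate((nu0 + n) / 2.0,
                                    2.0 / (nu0 * sigma0_sq + ss))
    sigma_sq = 1.0 / precision
    mus.append(mu)

post_mean_mu = sum(mus) / len(mus)  # posterior mean of mu, by averaging draws
```

Each iteration updates one parameter from its exact conditional distribution given the current value of the other; under mild conditions the pairs converge to draws from the joint posterior.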
The multivariate normal model
The multivariate normal density
A semiconjugate prior distribution for the mean
The inverse-Wishart distribution
Gibbs sampling of the mean and covariance
Missing data and imputation
Discussion and further references
Group comparisons and hierarchical modeling
Comparing two groups
Comparing multiple groups
Exchangeability and hierarchical models
The hierarchical normal model
Posterior inference
Example: Math scores in U.S. public schools
Prior distributions and posterior approximation
Posterior summaries and shrinkage
Hierarchical modeling of means and variances
Analysis of math score data
Discussion and further references
Linear regression
The linear regression model
Least squares estimation for the oxygen uptake data
Bayesian estimation for a regression model
A semiconjugate prior distribution
Default and weakly informative prior distributions
Model selection
Bayesian model comparison
Gibbs sampling and model averaging
Discussion and further references
Nonconjugate priors and Metropolis-Hastings algorithms
Generalized linear models
The Metropolis algorithm
The Metropolis algorithm for Poisson regression
Metropolis, Metropolis-Hastings and Gibbs
The Metropolis-Hastings algorithm
Why does the Metropolis-Hastings algorithm work?
Combining the Metropolis and Gibbs algorithms
A regression model with correlated errors
Analysis of the ice core data
Discussion and further references
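The Metropolis algorithm of this chapter samples from a posterior using only evaluations of the (unnormalized) log posterior density. A minimal random-walk Metropolis sketch (made-up Poisson data with a gamma prior, chosen so the conjugate posterior gives an exact answer to check against; not code from the book):

```python
import math
import random

random.seed(3)

# Poisson counts with a Gamma(a, b) prior on the rate theta.
# The conjugate posterior is Gamma(a + sum(y), b + n).
y = [2, 3, 1, 4, 2]
a, b = 2.0, 1.0
n, sy = len(y), sum(y)

def log_post(theta):
    """Unnormalized log posterior density of theta."""
    if theta <= 0:
        return -math.inf
    return (a + sy - 1) * math.log(theta) - (b + n) * theta

# random-walk Metropolis: propose theta* ~ N(theta, delta^2),
# accept with probability min(1, p(theta* | y) / p(theta | y)).
theta, delta = 1.0, 0.5
draws = []
for s in range(20000):
    prop = random.gauss(theta, delta)
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(theta))):
        theta = prop
    draws.append(theta)

approx_mean = sum(draws[2000:]) / len(draws[2000:])  # discard burn-in
exact_mean = (a + sy) / (b + n)                      # conjugate-posterior mean
```

Because the acceptance ratio uses a difference of log densities, the normalizing constant of the posterior cancels; this is what lets the same algorithm handle the nonconjugate models the chapter treats, where no closed-form posterior exists.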
Linear and generalized linear mixed effects models
A hierarchical regression model
Full conditional distributions
Posterior analysis of the math score data
Generalized linear mixed effects models
A Metropolis-Gibbs algorithm for posterior approximation
Analysis of tumor location data
Discussion and further references
Latent variable methods for ordinal data
Ordered probit regression and the rank likelihood
Probit regression
Transformation models and the rank likelihood
The Gaussian copula model
Rank likelihood for copula estimation
Discussion and further references
Exercises
Common distributions
References
Index