A First Course in Bayesian Statistical Methods

ISBN-10: 0387922997
ISBN-13: 9780387922997
Edition: 2009
Author: Peter D. Hoff
Buy it from $72.91
This item qualifies for FREE shipping

*A minimum purchase of $35 is required. Shipping is provided via FedEx SmartPost® and FedEx Express Saver®. Average delivery time is 1–5 business days, but delivery within that timeframe is not guaranteed. Also allow 1–2 days for processing. Free shipping is available only in the continental United States and excludes Hawaii, Alaska and Puerto Rico. FedEx service marks used by permission. "Marketplace" orders are not eligible for free or discounted shipping.

30-day, 100% satisfaction guarantee

If an item you ordered from TextbookRush does not meet your expectations due to an error on our part, simply fill out a return request and return the item by mail within 30 days of ordering for a full refund of the item cost.

Learn more about our returns policy

Description: This book provides a compact, self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers with only a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods.

New: starting from $72.91

Study Briefs

Limited time offer: Get the first one free!

All the information you need in one place! Each Study Brief is a summary of one specific subject: facts, figures, and explanations to help you learn faster.

Available titles (online content, regularly $4.95, on sale for $1.99):
History of Western Art
History of World Philosophies
American History Volume 1
History of Western Music


Book details

Copyright year: 2009
Publisher: Springer London, Limited
Publication date: 7/14/2009
Binding: Hardcover
Pages: 272
Size: 6.50" wide x 9.75" long x 0.75" tall
Weight: 1.276 lbs

This book provides a compact, self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers with only a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice.

Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
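To give a flavor of the Monte Carlo posterior summaries described above, here is a minimal R sketch (written for this page, not code from the book) assuming a simple beta-binomial setup: y successes out of n binary trials with a uniform Beta(1, 1) prior, so the posterior is Beta(1 + y, 1 + n - y) and can be summarized from simulated draws.

set.seed(1)
y <- 12; n <- 20                            # hypothetical data: 12 successes in 20 trials
a <- 1; b <- 1                              # uniform Beta(1, 1) prior
theta.mc <- rbeta(10000, a + y, b + n - y)  # Monte Carlo draws from the Beta posterior
mean(theta.mc)                              # Monte Carlo estimate of the posterior mean
quantile(theta.mc, c(0.025, 0.975))         # 95% quantile-based posterior interval

For a conjugate model like this one the posterior is available in closed form, so the Monte Carlo answers can be checked exactly; the same summary functions (means, quantiles) are applied to Gibbs and Metropolis-Hastings output for the more complex models treated later in the book.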

Table of contents

Introduction and examples
    Introduction
    Why Bayes?
    Estimating the probability of a rare event
    Building a predictive model
    Where we are going
    Discussion and further references
Belief, probability and exchangeability
    Belief functions and probabilities
    Events, partitions and Bayes' rule
    Independence
    Random variables
    Discrete random variables
    Continuous random variables
    Descriptions of distributions
    Joint distributions
    Independent random variables
    Exchangeability
    de Finetti's theorem
    Discussion and further references
One-parameter models
    The binomial model
    Inference for exchangeable binary data
    Confidence regions
    The Poisson model
    Posterior inference
    Example: Birth rates
    Exponential families and conjugate priors
    Discussion and further references
Monte Carlo approximation
    The Monte Carlo method
    Posterior inference for arbitrary functions
    Sampling from predictive distributions
    Posterior predictive model checking
    Discussion and further references
The normal model
    The normal model
    Inference for the mean, conditional on the variance
    Joint inference for the mean and variance
    Bias, variance and mean squared error
    Prior specification based on expectations
    The normal model for non-normal data
    Discussion and further references
Posterior approximation with the Gibbs sampler
    A semiconjugate prior distribution
    Discrete approximations
    Sampling from the conditional distributions
    Gibbs sampling
    General properties of the Gibbs sampler
    Introduction to MCMC diagnostics
    Discussion and further references
The multivariate normal model
    The multivariate normal density
    A semiconjugate prior distribution for the mean
    The inverse-Wishart distribution
    Gibbs sampling of the mean and covariance
    Missing data and imputation
    Discussion and further references
Group comparisons and hierarchical modeling
    Comparing two groups
    Comparing multiple groups
    Exchangeability and hierarchical models
    The hierarchical normal model
    Posterior inference
    Example: Math scores in U.S. public schools
    Prior distributions and posterior approximation
    Posterior summaries and shrinkage
    Hierarchical modeling of means and variances
    Analysis of math score data
    Discussion and further references
Linear regression
    The linear regression model
    Least squares estimation for the oxygen uptake data
    Bayesian estimation for a regression model
    A semiconjugate prior distribution
    Default and weakly informative prior distributions
    Model selection
    Bayesian model comparison
    Gibbs sampling and model averaging
    Discussion and further references
Nonconjugate priors and Metropolis-Hastings algorithms
    Generalized linear models
    The Metropolis algorithm
    The Metropolis algorithm for Poisson regression
    Metropolis, Metropolis-Hastings and Gibbs
    The Metropolis-Hastings algorithm
    Why does the Metropolis-Hastings algorithm work?
    Combining the Metropolis and Gibbs algorithms
    A regression model with correlated errors
    Analysis of the ice core data
    Discussion and further references
Linear and generalized linear mixed effects models
    A hierarchical regression model
    Full conditional distributions
    Posterior analysis of the math score data
    Generalized linear mixed effects models
    A Metropolis-Gibbs algorithm for posterior approximation
    Analysis of tumor location data
    Discussion and further references
Latent variable methods for ordinal data
    Ordered probit regression and the rank likelihood
    Probit regression
    Transformation models and the rank likelihood
    The Gaussian copula model
    Rank likelihood for copula estimation
    Discussion and further references
Exercises
Common distributions
References
Index
