Bayesian Data Analysis

ISBN-10: 1439840954

ISBN-13: 9781439840955

Edition: 3rd 2013 (Revised)

Authors: Andrew Gelman, John B. Carlin, Hal S. Stern, David B. Dunson, Aki Vehtari, Donald B. Rubin

List price: $95.00
This item qualifies for FREE shipping.
30-day, 100% satisfaction guarantee!

Rental notice: supplementary materials (access codes, CDs, etc.) are not guaranteed with rental orders.

Description:

This third edition of a classic textbook presents a comprehensive introduction to Bayesian data analysis. Written for students and researchers alike, the text is accessible, and its chapters contain many exercises as well as detailed worked examples drawn from a variety of disciplines. The third edition adds two new chapters on Bayesian nonparametrics, covers computation in R and Stan, and offers enhanced computing advice. The book's website includes solutions to the problems, data sets, software advice, and other ancillary material.
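
To give a flavor of the book's computational emphasis (the book itself works in R and Stan; see the "Computation in R and Stan" appendix in the contents below), here is a minimal Python sketch, not taken from the book, of the conjugate Beta-binomial analysis behind "Estimating a probability from binomial data" in the single-parameter models chapter. The prior parameters and data are made up for illustration.

    import numpy as np
    from scipy import stats

    # Beta-binomial conjugacy: with a Beta(a, b) prior on the success
    # probability theta and y successes in n trials, the posterior is
    # Beta(a + y, b + n - y).  (Illustrative values, not from the book.)
    a, b = 1.0, 1.0          # uniform prior on theta
    y, n = 7, 20             # hypothetical data: 7 successes in 20 trials

    posterior = stats.beta(a + y, b + n - y)
    print("posterior mean:", posterior.mean())
    print("95% central interval:", posterior.ppf([0.025, 0.975]))

    # Simulation-based summary, mirroring the book's use of posterior draws
    draws = posterior.rvs(size=4000, random_state=1)
    print("Pr(theta > 0.5) approx.", np.mean(draws > 0.5))

The same analysis can be reproduced in R with qbeta and rbeta.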

Book details

List price: $95.00
Edition: 3rd
Copyright year: 2013
Publisher: Taylor & Francis Group
Publication date: 11/5/2013
Binding: Hardcover
Pages: 675
Size: 7.50" wide x 10.25" long x 1.25" tall
Weight: 3.322 lbs
Language: English

Table of contents

Preface
Fundamentals of Bayesian Inference
Probability and inference
The three steps of Bayesian data analysis
General notation for statistical inference
Bayesian inference
Discrete probability examples: genetics and spell checking
Probability as a measure of uncertainty
Example of probability assignment: football point spreads
Example: estimating the accuracy of record linkage
Some useful results from probability theory
Computation and software
Bayesian inference in applied statistics
Bibliographic note
Exercises
Single-parameter models
Estimating a probability from binomial data
Posterior as compromise between data and prior information
Summarizing posterior inference
Informative prior distributions
Estimating a normal mean with known variance
Other standard single-parameter models
Example: informative prior distribution for cancer rates
Noninformative prior distributions
Weakly informative prior distributions
Bibliographic note
Exercises
Introduction to multiparameter models
Averaging over 'nuisance parameters'
Normal data with a noninformative prior distribution
Normal data with a conjugate prior distribution
Multinomial model for categorical data
Multivariate normal model with known variance
Multivariate normal with unknown mean and variance
Example: analysis of a bioassay experiment
Summary of elementary modeling and computation
Bibliographic note
Exercises
Asymptotics and connections to non-Bayesian approaches
Normal approximations to the posterior distribution
Large-sample theory
Counterexamples to the theorems
Frequency evaluations of Bayesian inferences
Bayesian interpretations of other statistical methods
Bibliographic note
Exercises
Hierarchical models
Constructing a parameterized prior distribution
Exchangeability and setting up hierarchical models
Fully Bayesian analysis of conjugate hierarchical models
Estimating exchangeable parameters from a normal model
Example: parallel experiments in eight schools
Hierarchical modeling applied to a meta-analysis
Weakly informative priors for hierarchical variance parameters
Bibliographic note
Exercises
Fundamentals of Bayesian Data Analysis
Model checking
The place of model checking in applied Bayesian statistics
Do the inferences from the model make sense?
Posterior predictive checking
Graphical posterior predictive checks
Model checking for the educational testing example
Bibliographic note
Exercises
Evaluating, comparing, and expanding models
Measures of predictive accuracy
Information criteria and cross-validation
Model comparison based on predictive performance
Model comparison using Bayes factors
Continuous model expansion
Implicit assumptions and model expansion: an example
Bibliographic note
Exercises
Modeling accounting for data collection
Bayesian inference requires a model for data collection
Data-collection models and ignorability
Sample surveys
Designed experiments
Sensitivity and the role of randomization
Observational studies
Censoring and truncation
Discussion
Bibliographic note
Exercises
Decision analysis
Bayesian decision theory in different contexts
Using regression predictions: incentives for telephone surveys
Multistage decision making: medical screening
Hierarchical decision analysis for radon measurement
Personal vs. institutional decision analysis
Bibliographic note
Exercises
Advanced Computation
Introduction to Bayesian computation
Numerical integration
Distributional approximations
Direct simulation and rejection sampling
Importance sampling
How many simulation draws are needed?
Computing environments
Debugging Bayesian computing
Bibliographic note
Exercises
Basics of Markov chain simulation
Gibbs sampler
Metropolis and Metropolis-Hastings algorithms
Using Gibbs and Metropolis as building blocks
Inference and assessing convergence
Effective number of simulation draws
Example: hierarchical normal model
Bibliographic note
Exercises
Computationally efficient Markov chain simulation
Efficient Gibbs samplers
Efficient Metropolis jumping rules
Further extensions to Gibbs and Metropolis
Hamiltonian Monte Carlo
Hamiltonian dynamics for a simple hierarchical model
Stan: developing a computing environment
Bibliographic note
Exercises
Modal and distributional approximations
Finding posterior modes
Boundary-avoiding priors for modal summaries
Normal and related mixture approximations
Finding marginal posterior modes using EM
Approximating conditional and marginal posterior densities
Example: hierarchical normal model (continued)
Variational inference
Expectation propagation
Other approximations
Unknown normalizing factors
Bibliographic note
Exercises
Regression Models
Introduction to regression models
Conditional modeling
Bayesian analysis of the classical regression model
Regression for causal inference: incumbency in congressional elections
Goals of regression analysis
Assembling the matrix of explanatory variables
Regularization and dimension reduction for multiple predictors
Unequal variances and correlations
Including numerical prior information
Bibliographic note
Exercises
Hierarchical linear models
Regression coefficients exchangeable in batches
Example: forecasting U.S. presidential elections
Interpreting a normal prior distribution as additional data
Varying intercepts and slopes
Computation: batching and transformation
Analysis of variance and the batching of coefficients
Hierarchical models for batches of variance components
Bibliographic note
Exercises
Generalized linear models
Standard generalized linear model likelihoods
Working with generalized linear models
Weakly informative priors for logistic regression
Example: hierarchical Poisson regression for police stops
Example: hierarchical logistic regression for political opinions
Models for multivariate and multinomial responses
Loglinear models for multivariate discrete data
Bibliographic note
Exercises
Models for robust inference
Aspects of robustness
Overdispersed versions of standard probability models
Posterior inference and computation
Robust inference and sensitivity analysis for the eight schools
Robust regression using t-distributed errors
Bibliographic note
Exercises
Models for missing data
Notation
Multiple imputation
Missing data in the multivariate normal and t models
Example: multiple imputation for a series of polls
Missing values with counted data
Example: an opinion poll in Slovenia
Bibliographic note
Exercises
Nonlinear and Nonparametric Models
Parametric nonlinear models
Example: serial dilution assay
Example: population toxicokinetics
Bibliographic note
Exercises
Basis function models
Splines and weighted sums of basis functions
Basis selection and shrinkage of coefficients
Non-normal models and multivariate regression surfaces
Bibliographic note
Exercises
Gaussian process models
Gaussian process regression
Example: birthdays and birthdates
Latent Gaussian process models
Functional data analysis
Density estimation and regression
Bibliographic note
Exercises
Finite mixture models
Setting up and interpreting mixture models
Example: reaction times and schizophrenia
Label switching and posterior computation
Unspecified number of mixture components
Mixture models for classification and regression
Bibliographic note
Exercises
Dirichlet process models
Bayesian histograms
Dirichlet process prior distributions
Dirichlet process mixtures
Beyond density estimation
Hierarchical dependence
Density regression
Bibliographic note
Exercises
Standard probability distributions
Continuous distributions
Discrete distributions
Bibliographic note
Outline of proofs of limit theorems
Bibliographic note
Computation in R and Stan
Getting started with R and Stan
Fitting a hierarchical model in Stan
Direct simulation, Gibbs, and Metropolis in R
Programming Hamiltonian Monte Carlo in R
Further comments on computation
Bibliographic note
References
Author Index
Subject Index