
Introduction to Bayesian Statistics


ISBN-10: 0471270202

ISBN-13: 9780471270201

Edition: 2004

Authors: William M. Bolstad


Description:

There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. In Bayesian statistics, the rules of probability are used to make inferences about the parameter: prior information about the parameter and sample information from the data are combined using Bayes' theorem. Bayesian statistics has important advantages that students should learn about if they are going into fields where statistics will be used. This book uniquely covers the topics usually found in an introductory statistics text, but from a Bayesian perspective.
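As a minimal sketch of that prior-plus-data updating (not taken from the book, with made-up prior and sample values), a conjugate beta prior for a binomial proportion combines with the observed counts like this in Python:

# Illustrative sketch only: combining a beta prior with binomial data via Bayes' theorem.
a_prior, b_prior = 2, 2             # assumed Beta(2, 2) prior for the proportion theta
successes, failures = 7, 3          # assumed sample information: 7 successes in 10 trials
a_post = a_prior + successes        # conjugacy makes the posterior Beta(9, 5)
b_post = b_prior + failures
posterior_mean = a_post / (a_post + b_post)
print(round(posterior_mean, 3))     # 0.643, between the prior mean 0.5 and the sample proportion 0.7

The posterior mean falls between the prior mean and the observed proportion, which is the prior-data compromise the description refers to; the conjugate-prior and MCMC chapters listed below develop this updating in far more generality.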

Book details

List price: $115.00
Copyright year: 2004
Publisher: John Wiley & Sons, Incorporated
Publication date: 4/26/2004
Binding: Hardcover
Pages: 376
Size: 6.25" wide x 9.25" long x 0.75" tall
Weight: 1.430 lbs
Language: English

William M. Bolstad, PhD, is Senior Lecturer in the Department of Statistics at The University of Waikato, New Zealand. He holds degrees from the University of Missouri, Stanford University, and The University of Waikato. Dr. Bolstad's research interests include Bayesian statistics, MCMC methods, recursive estimation techniques, multiprocess dynamic time series models, and forecasting.

Preface
Introduction to Bayesian Statistics
The Frequentist Approach to Statistics
The Bayesian Approach to Statistics
Comparing Likelihood and Bayesian Approaches to Statistics
Computational Bayesian Statistics
Purpose and Organization of This Book
Monte Carlo Sampling from the Posterior
Acceptance-Rejection Sampling
Sampling-Importance-Resampling
Adaptive Rejection Sampling from a Log-Concave Distribution
Why Direct Methods Are Inefficient for High-Dimension Parameter Space
Bayesian Inference
Bayesian Inference from the Numerical Posterior
Bayesian Inference from Posterior Random Sample
Bayesian Statistics Using Conjugate Priors
One-Dimensional Exponential Family of Densities
Distributions for Count Data
Distributions for Waiting Times
Normally Distributed Observations with Known Variance
Normally Distributed Observations with Known Mean
Normally Distributed Observations with Unknown Mean and Variance
Multivariate Normal Observations with Known Covariance Matrix
Observations from Normal Linear Regression Model
Appendix: Proof of Poisson Process Theorem
Markov Chains
Stochastic Processes
Markov Chains
Time-Invariant Markov Chains with Finite State Space
Classification of States of a Markov Chain
Sampling from a Markov Chain
Time-Reversible Markov Chains and Detailed Balance
Markov Chains with Continuous State Space
Markov Chain Monte Carlo Sampling from Posterior
Metropolis-Hastings Algorithm for a Single Parameter
Metropolis-Hastings Algorithm for Multiple Parameters
Blockwise Metropolis-Hastings Algorithm
Gibbs Sampling
Summary
Statistical Inference from a Markov Chain Monte Carlo Sample
Mixing Properties of the Chain
Finding a Heavy-Tailed Matched Curvature Candidate Density
Obtaining an Approximate Random Sample for Inference
Appendix: Procedure for Finding the Matched Curvature Candidate Density for a Multivariate Parameter
Logistic Regression
Logistic Regression Model
Computational Bayesian Approach to the Logistic Regression Model
Modelling with the Multiple Logistic Regression Model
Poisson Regression and Proportional Hazards Model
Poisson Regression Model
Computational Approach to Poisson Regression Model
The Proportional Hazards Model
Computational Bayesian Approach to Proportional Hazards Model
Gibbs Sampling and Hierarchical Models
Gibbs Sampling Procedure
The Gibbs Sampler for the Normal Distribution
Hierarchical Models and Gibbs Sampling
Modelling Related Populations with Hierarchical Models
Appendix: Proof That an Improper Jeffreys' Prior Distribution for the Hypervariance Can Lead to an Improper Posterior
Going Forward with Markov Chain Monte Carlo
Using the Included Minitab Macros
Using the Included R Functions
References
Topic Index