
Probability and Statistical Inference

ISBN-10: 0824703790

ISBN-13: 9780824703790

Edition: 2000

Author: Nitis Mukhopadhyay


Description:

This gracefully organized text presents the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts. Beginning with the basic ideas and techniques of probability theory and progressing to more rigorous topics, this treatment covers all of the topics typically addressed in a two-semester course in probability and statistical inference for graduate and upper-level undergraduate students, including hypothesis testing, Bayesian analysis, and sample-size determination. The author reinforces important ideas and special techniques with drills and…

Book details

List price: $150.00
Copyright year: 2000
Publisher: CRC Press LLC
Publication date: 3/22/2000
Binding: Hardcover
Pages: 665
Size: 6.25" wide x 9.25" long x 1.50" tall
Weight: 2.530 lbs
Language: English

Preface
Acknowledgments
Notions of Probability
Introduction
About Sets
Axiomatic Development of Probability
The Conditional Probability and Independent Events
Calculus of Probability
Bayes's Theorem
Selected Counting Rules
Discrete Random Variables
Probability Mass and Distribution Functions
Continuous Random Variables
Probability Density and Distribution Functions
The Median of a Distribution
Selected Reviews from Mathematics
Some Standard Probability Distributions
Discrete Distributions
Continuous Distributions
Exercises and Complements
Expectations of Functions of Random Variables
Introduction
Expectation and Variance
The Bernoulli Distribution
The Binomial Distribution
The Poisson Distribution
The Uniform Distribution
The Normal Distribution
The Laplace Distribution
The Gamma Distribution
The Moments and Moment Generating Function
The Binomial Distribution
The Poisson Distribution
The Normal Distribution
The Gamma Distribution
Determination of a Distribution via MGF
The Probability Generating Function
Exercises and Complements
Multivariate Random Variables
Introduction
Discrete Distributions
The Joint, Marginal and Conditional Distributions
The Multinomial Distribution
Continuous Distributions
The Joint, Marginal and Conditional Distributions
Three and Higher Dimensions
Covariances and Correlation Coefficients
The Multinomial Case
Independence of Random Variables
The Bivariate Normal Distribution
Correlation Coefficient and Independence
The Exponential Family of Distributions
One-parameter Situation
Multi-parameter Situation
Some Standard Probability Inequalities
Markov and Bernstein-Chernoff Inequalities
Tchebysheff's Inequality
Cauchy-Schwarz and Covariance Inequalities
Jensen's and Lyapunov's Inequalities
Hölder's Inequality
Bonferroni Inequality
Central Absolute Moment Inequality
Exercises and Complements
Functions of Random Variables and Sampling Distribution
Introduction
Using Distribution Functions
Discrete Cases
Continuous Cases
The Order Statistics
The Convolution
The Sampling Distribution
Using the Moment Generating Function
A General Approach with Transformations
Several Variable Situations
Special Sampling Distributions
The Student's t Distribution
The F Distribution
The Beta Distribution
Special Continuous Multivariate Distributions
The Normal Distribution
The t Distribution
The F Distribution
Importance of Independence in Sampling Distributions
Reproductivity of Normal Distributions
Reproductivity of Chi-square Distributions
The Student's t Distribution
The F Distribution
Selected Review in Matrices and Vectors
Exercises and Complements
Concepts of Stochastic Convergence
Introduction
Convergence in Probability
Convergence in Distribution
Combination of the Modes of Convergence
The Central Limit Theorems
Convergence of Chi-square, t, and F Distributions
The Chi-square Distribution
The Student's t Distribution
The F Distribution
Convergence of the PDF and Percentage Points
Exercises and Complements
Sufficiency, Completeness, and Ancillarity
Introduction
Sufficiency
The Conditional Distribution Approach
The Neyman Factorization Theorem
Minimal Sufficiency
The Lehmann-Scheffé Approach
Information
One-parameter Situation
Multi-parameter Situation
Ancillarity
The Location, Scale, and Location-Scale Families
Its Role in the Recovery of Information
Completeness
Complete Sufficient Statistics
Basu's Theorem
Exercises and Complements
Point Estimation
Introduction
Finding Estimators
The Method of Moments
The Method of Maximum Likelihood
Criteria to Compare Estimators
Unbiasedness, Variance and Mean Squared Error
Best Unbiased and Linear Unbiased Estimators
Improved Unbiased Estimator via Sufficiency
The Rao-Blackwell Theorem
Uniformly Minimum Variance Unbiased Estimator
The Cramér-Rao Inequality and UMVUE
The Lehmann-Scheffé Theorems and UMVUE
A Generalization of the Cramér-Rao Inequality
Evaluation of Conditional Expectations
Unbiased Estimation Under Incompleteness
Does the Rao-Blackwell Theorem Lead to UMVUE?
Consistent Estimators
Exercises and Complements
Tests of Hypotheses
Introduction
Error Probabilities and the Power Function
The Concept of a Best Test
Simple Null Versus Simple Alternative Hypotheses
Most Powerful Test via the Neyman-Pearson Lemma
Applications: No Parameters Are Involved
Applications: Observations Are Non-IID
One-Sided Composite Alternative Hypothesis
UMP Test via the Neyman-Pearson Lemma
Monotone Likelihood Ratio Property
UMP Test via MLR Property
Simple Null Versus Two-Sided Alternative Hypotheses
An Example Where UMP Test Does Not Exist
An Example Where UMP Test Exists
Unbiased and UMP Unbiased Tests
Exercises and Complements
Confidence Interval Estimation
Introduction
One-Sample Problems
Inversion of a Test Procedure
The Pivotal Approach
The Interpretation of a Confidence Coefficient
Ideas of Accuracy Measures
Using Confidence Intervals in the Tests of Hypothesis
Two-Sample Problems
Comparing the Location Parameters
Comparing the Scale Parameters
Multiple Comparisons
Estimating a Multivariate Normal Mean Vector
Comparing the Means
Comparing the Variances
Exercises and Complements
Bayesian Methods
Introduction
Prior and Posterior Distributions
The Conjugate Priors
Point Estimation
Credible Intervals
Highest Posterior Density
Contrasting with the Confidence Intervals
Tests of Hypotheses
Examples with Non-Conjugate Priors
Exercises and Complements
Likelihood Ratio and Other Tests
Introduction
One-Sample Problems
LR Test for the Mean
LR Test for the Variance
Two-Sample Problems
Comparing the Means
Comparing the Variances
Bivariate Normal Observations
Comparing the Means: The Paired Difference t Method
LR Test for the Correlation Coefficient
Tests for the Variances
Exercises and Complements
Large-Sample Inference
Introduction
The Maximum Likelihood Estimation
Confidence Intervals and Tests of Hypothesis
The Distribution-Free Population Mean
The Binomial Proportion
The Poisson Mean
The Variance Stabilizing Transformations
The Binomial Proportion
The Poisson Mean
The Correlation Coefficient
Exercises and Complements
Sample Size Determination: Two-Stage Procedures
Introduction
The Fixed-Width Confidence Interval
Stein's Sampling Methodology
Some Interesting Properties
The Bounded Risk Point Estimation
The Sampling Methodology
Some Interesting Properties
Exercises and Complements
Appendix
Abbreviations and Notation
A Celebration of Statistics: Selected Biographical Notes
Selected Statistical Tables
The Standard Normal Distribution Function
Percentage Points of the Chi-Square Distribution
Percentage Points of the Student's t Distribution
Percentage Points of the F Distribution
References
Index