
Introduction to Linear Regression Analysis

ISBN-10: 0471754951

ISBN-13: 9780471754954

Edition: 4th 2006 (Revised)

Authors: Douglas C. Montgomery, Elizabeth A. Peck, G. Geoffrey Vining

Description:

A comprehensive and up-to-date introduction to the fundamentals of regression analysis. The Fourth Edition of Introduction to Linear Regression Analysis describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research. This popular book blends theory and application to equip the reader with an understanding of the basic principles needed to apply regression model-building techniques in a wide variety of application environments. It assumes a working knowledge of basic statistics and a familiarity with hypothesis testing and confidence intervals, as well as the normal, t, χ², and F distributions. …
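
To give a concrete sense of the model-building basics the book covers (least-squares estimation of the slope and intercept, a t test for significance of regression, and a confidence interval on the slope), here is a minimal illustrative sketch in Python with NumPy and SciPy. It is not taken from the book, whose computing examples use SAS and MINITAB, and the small data set below is made up.

    # Illustrative sketch only (not from the book); made-up data.
    # Fits y = b0 + b1*x by ordinary least squares and tests H0: slope = 0.
    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1, 14.2, 15.8])

    n = len(x)
    Sxx = np.sum((x - x.mean()) ** 2)
    Sxy = np.sum((x - x.mean()) * (y - y.mean()))

    b1 = Sxy / Sxx                         # least-squares slope estimate
    b0 = y.mean() - b1 * x.mean()          # least-squares intercept estimate

    resid = y - (b0 + b1 * x)
    ms_res = np.sum(resid ** 2) / (n - 2)  # residual mean square, estimates the error variance

    se_b1 = np.sqrt(ms_res / Sxx)          # standard error of the slope
    t_stat = b1 / se_b1                    # t statistic for H0: slope = 0
    p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

    t_crit = stats.t.ppf(0.975, df=n - 2)  # 97.5th percentile for a 95% CI
    ci_low, ci_high = b1 - t_crit * se_b1, b1 + t_crit * se_b1

    print(f"intercept = {b0:.3f}, slope = {b1:.3f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.4g}, 95% CI for slope: ({ci_low:.3f}, {ci_high:.3f})")

The same quantities could be obtained from a packaged routine; the point of the sketch is only to show, under the stated assumptions, the estimates and test statistics that the opening chapters develop.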

Book details

List price: $150.00
Edition: 4th
Copyright year: 2006
Publisher: John Wiley & Sons, Incorporated
Publication date: 7/21/2006
Binding: Hardcover
Pages: 640
Size: 7.25" wide x 10.25" long x 1.25" tall
Weight: 2.948 lbs
Language: English

Table of contents

Preface
Introduction
Regression and Model Building
Data Collection
Uses of Regression
Role of the Computer
Simple Linear Regression
Simple Linear Regression Model
Least-Squares Estimation of the Parameters
Estimation of β₀ and β₁
Properties of the Least-Squares Estimators and the Fitted Regression Model
Estimation of σ²
Alternate Form of the Model
Hypothesis Testing on the Slope and Intercept
Use of t Tests
Testing Significance of Regression
Analysis of Variance
Interval Estimation in Simple Linear Regression
Confidence Intervals on β₀, β₁, and σ²
Interval Estimation of the Mean Response
Prediction of New Observations
Coefficient of Determination
Using SAS for Simple Linear Regression
Some Considerations in the Use of Regression
Regression Through the Origin
Estimation by Maximum Likelihood
Case Where the Regressor x is Random
x and y Jointly Distributed
x and y Jointly Normally Distributed: Correlation Model
Problems
Multiple Linear Regression
Multiple Regression Models
Estimation of the Model Parameters
Least-Squares Estimation of the Regression Coefficients
Geometrical Interpretation of Least Squares
Properties of the Least-Squares Estimators
Estimation of σ²
Inadequacy of Scatter Diagrams in Multiple Regression
Maximum-Likelihood Estimation
Hypothesis Testing in Multiple Linear Regression
Test for Significance of Regression
Tests on Individual Regression Coefficients
Special Case of Orthogonal Columns in X
Testing the General Linear Hypothesis
Confidence Intervals in Multiple Regression
Confidence Intervals on the Regression Coefficients
Confidence Interval Estimation of the Mean Response
Simultaneous Confidence Intervals on Regression Coefficients
Prediction of New Observations
Using SAS for Basic Multiple Linear Regression
Hidden Extrapolation in Multiple Regression
Standardized Regression Coefficients
Multicollinearity
Why Do Regression Coefficients Have the Wrong Sign?
Problems
Model Adequacy Checking
Introduction
Residual Analysis
Definition of Residuals
Methods for Scaling Residuals
Residual Plots
Partial Regression and Partial Residual Plots
Using MINITAB and SAS for Residual Analysis
Other Residual Plotting and Analysis Methods
PRESS Statistic
Detection and Treatment of Outliers
Lack of Fit of the Regression Model
Formal Test for Lack of Fit
Estimation of Pure Error from Near Neighbors
Problems
Transformations and Weighting to Correct Model Inadequacies
Introduction
Variance-Stabilizing Transformations
Transformations to Linearize the Model
Analytical Methods for Selecting a Transformation
Transformations on y: The Box-Cox Method
Transformations on the Regressor Variables
Generalized and Weighted Least Squares
Generalized Least Squares
Weighted Least Squares
Some Practical Issues
Problems
Diagnostics for Leverage and Influence
Importance of Detecting Influential Observations
Leverage
Measures of Influence: Cook's D
Measures of Influence: DFFITS and DFBETAS
A Measure of Model Performance
Detecting Groups of Influential Observations
Treatment of Influential Observations
Problems
Polynomial Regression Models
Introduction
Polynomial Models in One Variable
Basic Principles
Piecewise Polynomial Fitting (Splines)
Polynomial and Trigonometric Terms
Nonparametric Regression
Kernel Regression
Locally Weighted Regression (Loess)
Final Cautions
Polynomial Models in Two or More Variables
Orthogonal Polynomials
Problems
Indicator Variables
General Concept of Indicator Variables
Comments on the Use of Indicator Variables
Indicator Variables versus Regression on Allocated Codes
Indicator Variables as a Substitute for a Quantitative Regressor
Regression Approach to Analysis of Variance
Problems
Variable Selection and Model Building
Introduction
Model-Building Problem
Consequences of Model Misspecification
Criteria for Evaluating Subset Regression Models
Computational Techniques for Variable Selection
All Possible Regressions
Stepwise Regression Methods
Strategy for Variable Selection and Model Building
Case Study: Gorman and Toman Asphalt Data Using SAS
Problems
Validation of Regression Models
Introduction
Validation Techniques
Analysis of Model Coefficients and Predicted Values
Collecting Fresh Data-Confirmation Runs
Data Splitting
Data from Planned Experiments
Problems
Multicollinearity
Introduction
Sources of Multicollinearity
Effects of Multicollinearity
Multicollinearity Diagnostics
Examination of the Correlation Matrix
Variance Inflation Factors
Eigensystem Analysis of X'X
Other Diagnostics
SAS Code for Generating Multicollinearity Diagnostics
Methods for Dealing with Multicollinearity
Collecting Additional Data
Model Respecification
Ridge Regression
Principal-Component Regression
Comparison and Evaluation of Biased Estimators
Using SAS to Perform Ridge and Principal-Component Regression
Problems
Robust Regression
Need for Robust Regression
M-Estimators
Properties of Robust Estimators
Breakdown Point
Efficiency
Survey of Other Robust Regression Estimators
High-Breakdown-Point Estimators
Bounded Influence Estimators
Other Procedures
Computing Robust Regression Estimators
Problems
Introduction to Nonlinear Regression
Linear and Nonlinear Regression Models
Linear Regression Models
Nonlinear Regression Models
Origins of Nonlinear Models
Nonlinear Least Squares
Transformation to a Linear Model
Parameter Estimation in a Nonlinear System
Linearization
Other Parameter Estimation Methods
Starting Values
Computer Programs
Statistical Inference in Nonlinear Regression
Examples of Nonlinear Regression Models
Using SAS PROC NLIN
Problems
Generalized Linear Models
Introduction
Logistic Regression Models
Models with a Binary Response Variable
Estimating the Parameters in a Logistic Regression Model
Interpretation of the Parameters in a Logistic Regression Model
Statistical Inference on Model Parameters
Diagnostic Checking in Logistic Regression
Other Models for Binary Response Data
More Than Two Categorical Outcomes
Poisson Regression
The Generalized Linear Model
Link Functions and Linear Predictors
Parameter Estimation and Inference in the GLM
Prediction and Estimation with the GLM
Residual Analysis in the GLM
Overdispersion
Problems
Other Topics in the Use of Regression Analysis
Regression Models with Autocorrelated Errors
Source and Effects of Autocorrelation
Detecting the Presence of Autocorrelation
Parameter Estimation Methods
Effect of Measurement Errors in the Regressors
Simple Linear Regression
Berkson Model
Inverse Estimation-The Calibration Problem
Bootstrapping in Regression
Bootstrap Sampling in Regression
Bootstrap Confidence Intervals
Classification and Regression Trees (CART)
Neural Networks
Designed Experiments for Regression
Problems
Statistical Tables
Data Sets For Exercises
Supplemental Technical Material
Background on Basic Test Statistics
Background from the Theory of Linear Models
Important Results on SS_R and SS_Res
Gauss-Markov Theorem, Var(ε) = σ²I
Computational Aspects of Multiple Regression
Result on the Inverse of a Matrix
Development of the PRESS Statistic
Development of S²_(i)
Outlier Test Based on R-Student
Independence of Residuals and Fitted Values
The Gauss-Markov Theorem, Var(ε) = V
Bias in MS_Res When the Model Is Underspecified
Computation of Influence Diagnostics
Generalized Linear Models
Introduction to SAS
Basic Data Entry
Creating Permanent SAS Data Sets
Importing Data from an Excel File
Output Command
Log File
Adding Variables to an Existing SAS Data Set
References
Index