Introduction to Linear Regression Analysis

ISBN-10: 0471754951
ISBN-13: 9780471754954
Edition: 4th 2006 (Revised)

Book details

List price: $150.00
Edition: 4th
Copyright year: 2006
Publisher: John Wiley & Sons, Incorporated
Publication date: 7/21/2006
Binding: Hardcover
Pages: 640
Size: 7.25" wide x 10.25" long x 1.25" tall
Weight: 2.684 lbs
Language: English

A comprehensive and up-to-date introduction to the fundamentals of regression analysis.

The Fourth Edition of Introduction to Linear Regression Analysis describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research. This popular book blends theory and application to equip the reader with an understanding of the basic principles needed to apply regression model-building techniques in a wide variety of settings. It assumes a working knowledge of basic statistics and a familiarity with hypothesis testing and confidence intervals, as well as the normal, t, χ², and F distributions.

Illustrating all of the major procedures employed by the contemporary software packages MINITAB®, SAS®, and S-PLUS®, the Fourth Edition begins with a general introduction to regression modeling, including typical applications. A host of technical tools are outlined, such as basic inference procedures, introductory aspects of model adequacy checking, and polynomial regression models and their variations. The book discusses how transformations and weighted least squares can be used to resolve problems of model inadequacy, and how to deal with influential observations. Subsequent chapters discuss:

* Indicator variables and the connection between regression and analysis-of-variance models
* Variable selection and model-building techniques and strategies
* The multicollinearity problem: its sources, effects, diagnostics, and remedial measures
* Robust regression techniques such as M-estimators, and properties of robust estimators
* The basics of nonlinear regression
* Generalized linear models
* Using SAS® for regression problems

This book is a robust resource that offers solid methodology for statistical practitioners and professionals in engineering, the physical and chemical sciences, economics, management, the life and biological sciences, and the social sciences.

Both the accompanying FTP site, which contains data sets, extensive problem solutions, software hints, and PowerPoint® slides, and the book's revised presentation of topics in increasing order of complexity facilitate its use in a classroom setting. With its new exercises and structure, this book is highly recommended for upper-undergraduate and beginning graduate students in mathematics, engineering, and the natural sciences. Scientists and engineers will find the book an excellent choice for reference and self-study.
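To give a flavor of the book's core topic, a simple linear regression fit by ordinary least squares can be sketched in a few lines of plain Python using the closed-form estimates b₁ = Sxy/Sxx and b₀ = ȳ − b₁x̄. This sketch is for illustration only; it is not an excerpt from the book, and the data are made up.

```python
# Simple linear regression y = b0 + b1*x, fit by ordinary least squares.
# Closed-form estimates: b1 = Sxy / Sxx, b0 = ybar - b1 * xbar.
# The data below are invented purely for illustration.

def ols_fit(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)                        # corrected sum of squares of x
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))   # corrected cross-product
    b1 = sxy / sxx          # slope estimate
    b0 = ybar - b1 * xbar   # intercept estimate
    return b0, b1

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = ols_fit(x, y)
print(f"y = {b0:.3f} + {b1:.3f} x")  # prints "y = 0.050 + 1.990 x"
```

The same fit is what software such as MINITAB or SAS PROC REG produces for the simple (one-regressor) case; the book develops the matrix form of these estimators for multiple regression.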

Preface
Introduction
Regression and Model Building
Data Collection
Uses of Regression
Role of the Computer
Simple Linear Regression
Simple Linear Regression Model
Least-Squares Estimation of the Parameters
Estimation of β₀ and β₁
Properties of the Least-Squares Estimators and the Fitted Regression Model
Estimation of σ²
Alternate Form of the Model
Hypothesis Testing on the Slope and Intercept
Use of t Tests
Testing Significance of Regression
Analysis of Variance
Interval Estimation in Simple Linear Regression
Confidence Intervals on β₀, β₁, and σ²
Interval Estimation of the Mean Response
Prediction of New Observations
Coefficient of Determination
Using SAS for Simple Linear Regression
Some Considerations in the Use of Regression
Regression Through the Origin
Estimation by Maximum Likelihood
Case Where the Regressor x is Random
x and y Jointly Distributed
x and y Jointly Normally Distributed: Correlation Model
Problems
Multiple Linear Regression
Multiple Regression Models
Estimation of the Model Parameters
Least-Squares Estimation of the Regression Coefficients
Geometrical Interpretation of Least Squares
Properties of the Least-Squares Estimators
Estimation of σ²
Inadequacy of Scatter Diagrams in Multiple Regression
Maximum-Likelihood Estimation
Hypothesis Testing in Multiple Linear Regression
Test for Significance of Regression
Tests on Individual Regression Coefficients
Special Case of Orthogonal Columns in X
Testing the General Linear Hypothesis
Confidence Intervals in Multiple Regression
Confidence Intervals on the Regression Coefficients
Confidence Interval Estimation of the Mean Response
Simultaneous Confidence Intervals on Regression Coefficients
Prediction of New Observations
Using SAS for Basic Multiple Linear Regression
Hidden Extrapolation in Multiple Regression
Standardized Regression Coefficients
Multicollinearity
Why Do Regression Coefficients Have the Wrong Sign?
Problems
Model Adequacy Checking
Introduction
Residual Analysis
Definition of Residuals
Methods for Scaling Residuals
Residual Plots
Partial Regression and Partial Residual Plots
Using MINITAB and SAS for Residual Analysis
Other Residual Plotting and Analysis Methods
PRESS Statistic
Detection and Treatment of Outliers
Lack of Fit of the Regression Model
Formal Test for Lack of Fit
Estimation of Pure Error from Near Neighbors
Problems
Transformations and Weighting to Correct Model Inadequacies
Introduction
Variance-Stabilizing Transformations
Transformations to Linearize the Model
Analytical Methods for Selecting a Transformation
Transformations on y: The Box-Cox Method
Transformations on the Regressor Variables
Generalized and Weighted Least Squares
Generalized Least Squares
Weighted Least Squares
Some Practical Issues
Problems
Diagnostics for Leverage and Influence
Importance of Detecting Influential Observations
Leverage
Measures of Influence: Cook's D
Measures of Influence: DFFITS and DFBETAS
A Measure of Model Performance
Detecting Groups of Influential Observations
Treatment of Influential Observations
Problems
Polynomial Regression Models
Introduction
Polynomial Models in One Variable
Basic Principles
Piecewise Polynomial Fitting (Splines)
Polynomial and Trigonometric Terms
Nonparametric Regression
Kernel Regression
Locally Weighted Regression (Loess)
Final Cautions
Polynomial Models in Two or More Variables
Orthogonal Polynomials
Problems
Indicator Variables
General Concept of Indicator Variables
Comments on the Use of Indicator Variables
Indicator Variables versus Regression on Allocated Codes
Indicator Variables as a Substitute for a Quantitative Regressor
Regression Approach to Analysis of Variance
Problems
Variable Selection and Model Building
Introduction
Model-Building Problem
Consequences of Model Misspecification
Criteria for Evaluating Subset Regression Models
Computational Techniques for Variable Selection
All Possible Regressions
Stepwise Regression Methods
Strategy for Variable Selection and Model Building
Case Study: Gorman and Toman Asphalt Data Using SAS
Problems
Validation of Regression Models
Introduction
Validation Techniques
Analysis of Model Coefficients and Predicted Values
Collecting Fresh Data: Confirmation Runs
Data Splitting
Data from Planned Experiments
Problems
Multicollinearity
Introduction
Sources of Multicollinearity
Effects of Multicollinearity
Multicollinearity Diagnostics
Examination of the Correlation Matrix
Variance Inflation Factors
Eigensystem Analysis of X'X
Other Diagnostics
SAS Code for Generating Multicollinearity Diagnostics
Methods for Dealing with Multicollinearity
Collecting Additional Data
Model Respecification
Ridge Regression
Principal-Component Regression
Comparison and Evaluation of Biased Estimators
Using SAS to Perform Ridge and Principal-Component Regression
Problems
Robust Regression
Need for Robust Regression
M-Estimators
Properties of Robust Estimators
Breakdown Point
Efficiency
Survey of Other Robust Regression Estimators
High-Breakdown-Point Estimators
Bounded Influence Estimators
Other Procedures
Computing Robust Regression Estimators
Problems
Introduction to Nonlinear Regression
Linear and Nonlinear Regression Models
Linear Regression Models
Nonlinear Regression Models
Origins of Nonlinear Models
Nonlinear Least Squares
Transformation to a Linear Model
Parameter Estimation in a Nonlinear System
Linearization
Other Parameter Estimation Methods
Starting Values
Computer Programs
Statistical Inference in Nonlinear Regression
Examples of Nonlinear Regression Models
Using SAS PROC NLIN
Problems
Generalized Linear Models
Introduction
Logistic Regression Models
Models with a Binary Response Variable
Estimating the Parameters in a Logistic Regression Model
Interpretation of the Parameters in a Logistic Regression Model
Statistical Inference on Model Parameters
Diagnostic Checking in Logistic Regression
Other Models for Binary Response Data
More Than Two Categorical Outcomes
Poisson Regression
The Generalized Linear Model
Link Functions and Linear Predictors
Parameter Estimation and Inference in the GLM
Prediction and Estimation with the GLM
Residual Analysis in the GLM
Overdispersion
Problems
Other Topics in the Use of Regression Analysis
Regression Models with Autocorrelated Errors
Source and Effects of Autocorrelation
Detecting the Presence of Autocorrelation
Parameter Estimation Methods
Effect of Measurement Errors in the Regressors
Simple Linear Regression
Berkson Model
Inverse Estimation: The Calibration Problem
Bootstrapping in Regression
Bootstrap Sampling in Regression
Bootstrap Confidence Intervals
Classification and Regression Trees (CART)
Neural Networks
Designed Experiments for Regression
Problems
Statistical Tables
Data Sets For Exercises
Supplemental Technical Material
Background on Basic Test Statistics
Background from the Theory of Linear Models
Important Results on SS_R and SS_Res
Gauss-Markov Theorem, Var(ε) = σ²I
Computational Aspects of Multiple Regression
Result on the Inverse of a Matrix
Development of the PRESS Statistic
Development of S²_(i)
Outlier Test Based on R-Student
Independence of Residuals and Fitted Values
The Gauss-Markov Theorem, Var(ε) = V
Bias in MS_Res When the Model Is Underspecified
Computation of Influence Diagnostics
Generalized Linear Models
Introduction to SAS
Basic Data Entry
Creating Permanent SAS Data Sets
Importing Data from an Excel File
Output Command
Log File
Adding Variables to an Existing SAS Data Set
References
Index
