
Linear Models and Generalizations: Least Squares and Alternatives

ISBN-10: 3540742263
ISBN-13: 9783540742265
Edition: 3rd, 2008
List price: $139.00

Book details

List price: $139.00
Edition: 3rd
Copyright year: 2008
Publisher: Springer
Publication date: 10/12/2007
Binding: Hardcover
Pages: 572
Size: 6.50" wide x 9.25" long x 1.50" tall
Weight: 2.2 lbs
Language: English

C. R. Rao, born in India, is one of the foremost statisticians of the twentieth century. He received his education in statistics at the Indian Statistical Institute (ISI), Calcutta. He is Emeritus Holder of the Eberly Family Chair in Statistics at Penn State and Director of the Center for Multivariate Analysis. Long recognized as one of the world's top statisticians, he has been awarded 34 honorary doctorates from universities in 19 countries spanning 6 continents. His research has influenced not only statistics but also the physical, social, and natural sciences and engineering. In 2011 he received the Royal Statistical Society's Guy Medal in Gold, which is awarded triennially to those "who are judged to have merited a signal mark of distinction by reason of their innovative contributions to the theory or application of statistics". It can be awarded both to fellows (members) of the Society and to non-fellows. Since its inception 120 years ago, the Gold Medal has been awarded to 34 distinguished statisticians; the first was awarded to Charles Booth in 1892. Before Rao, only two statisticians outside Great Britain, H. Cramér (Swedish) and J. Neyman (Polish), had received the Gold Medal, and C. R. Rao is the first non-European and non-American to receive the award.
Other awards he has received include the Gold Medal of Calcutta University; the Wilks Medal of the American Statistical Association; the Wilks Army Medal; the Guy Medal in Silver of the Royal Statistical Society (UK); the Meghnad Saha Medal and the Srinivasa Ramanujan Medal of the Indian National Science Academy; the J. C. Bose Gold Medal of the Bose Institute; the Mahalanobis Centenary Gold Medal of the Indian Science Congress; and the Bhatnagar Award of the Council of Scientific and Industrial Research, India. The Government of India honored him with its second-highest civilian award, the Padma Vibhushan, for "outstanding contributions to Science and Engineering / Statistics", and also instituted a cash award in his honor, "to be given once in two years to a young statistician for work done during the preceding 3 years in any field of statistics". In recognition of his achievements, an institute named after him, the C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science, was established on the campus of the University of Hyderabad, India.

Preface to the First Edition
Preface to the Second Edition
Preface to the Third Edition
Linear Models and Regression Analysis
Plan of the Book
The Simple Linear Regression Model
The Linear Model
Least Squares Estimation
Direct Regression Method
Properties of the Direct Regression Estimators
Centered Model
No Intercept Term Model
Maximum Likelihood Estimation
Testing of Hypotheses and Confidence Interval Estimation
Analysis of Variance
Goodness of Fit of Regression
Reverse Regression Method
Orthogonal Regression Method
Reduced Major Axis Regression Method
Least Absolute Deviation Regression Method
Estimation of Parameters when X Is Stochastic
The Multiple Linear Regression Model and Its Extensions
The Linear Model
The Principle of Ordinary Least Squares (OLS)
Geometric Properties of OLS
Best Linear Unbiased Estimation
Basic Theorems
Linear Estimators
Mean Dispersion Error
Estimation (Prediction) of the Error Term ε and σ<sup>2</sup>
Classical Regression under Normal Errors
The Maximum-Likelihood (ML) Principle
Maximum Likelihood Estimation in Classical Normal Regression
Consistency of Estimators
Testing Linear Hypotheses
Analysis of Variance
Goodness of Fit
Checking the Adequacy of Regression Analysis
Univariate Regression
Multiple Regression
A Complex Example
Graphical Presentation
Linear Regression with Stochastic Regressors
Regression and Multiple Correlation Coefficient
Heterogeneous Linear Estimation without Normality
Heterogeneous Linear Estimation under Normality
The Canonical Form
Identification and Quantification of Multicollinearity
Principal Components Regression
Ridge Estimation
Shrinkage Estimates
Partial Least Squares
Tests of Parameter Constancy
The Chow Forecast Test
The Hansen Test
Tests with Recursive Estimation
Test for Structural Change
Total Least Squares
Minimax Estimation
Inequality Restrictions
The Minimax Principle
Censored Regression
LAD Estimators and Asymptotic Normality
Tests of Linear Hypotheses
Simultaneous Confidence Intervals
Confidence Interval for the Ratio of Two Linear Parametric Functions
Nonparametric Regression
Estimation of the Regression Function
Classification and Regression Trees (CART)
Boosting and Bagging
Projection Pursuit Regression
Neural Networks and Nonparametric Regression
Logistic Regression and Neural Networks
Functional Data Analysis (FDA)
Restricted Regression
Problem of Selection
Theory of Restricted Regression
Efficiency of Selection
Explicit Solution in Special Cases
LINEX Loss Function
Balanced Loss Function
Linear Models without Moments: Exercise
Nonlinear Improvement of OLSE for Nonnormal Disturbances
A Characterization of the Least Squares Estimator
A Characterization of the Least Squares Estimator: A Lemma
The Generalized Linear Regression Model
Optimal Linear Estimation of β
R<sub>1</sub>-Optimal Estimators
R<sub>2</sub>-Optimal Estimators
R<sub>3</sub>-Optimal Estimators
The Aitken Estimator
Misspecification of the Dispersion Matrix
Heteroscedasticity and Autoregression
Mixed Effects Model: Unified Theory of Linear Estimation
Mixed Effects Model
A Basic Lemma
Estimation of Xβ (the Fixed Effect)
Prediction of Uξ (the Random Effect)
Estimation of ε
Linear Mixed Models with Normal Errors and Random Effects
Maximum Likelihood Estimation of Linear Mixed Models
Restricted Maximum Likelihood Estimation of Linear Mixed Models
Inference for Linear Mixed Models
Regression-Like Equations in Econometrics
Econometric Models
The Reduced Form
The Multivariate Regression Model
The Classical Multivariate Linear Regression Model
Stochastic Regression
Instrumental Variable Estimator
Seemingly Unrelated Regressions
Measurement Error Models
Simultaneous Parameter Estimation by Empirical Bayes Solutions
Estimation of Parameters from Different Linear Models
Gauss-Markov, Aitken and Rao Least Squares Estimators
Gauss-Markov Least Squares
Aitken Least Squares
Rao Least Squares
Exact and Stochastic Linear Restrictions
Use of Prior Information
The Restricted Least-Squares Estimator
Maximum Likelihood Estimation under Exact Restrictions
Stepwise Inclusion of Exact Linear Restrictions
Biased Linear Restrictions and MDE Comparison with the OLSE
MDE Matrix Comparisons of Two Biased Estimators
MDE Matrix Comparison of Two Linear Biased Estimators
MDE Comparison of Two (Biased) Restricted Estimators
Stein-Rule Estimators under Exact Restrictions
Stochastic Linear Restrictions
Mixed Estimator
Assumptions about the Dispersion Matrix
Biased Stochastic Restrictions
Stein-Rule Estimators under Stochastic Restrictions
Weakened Linear Restrictions
Weakly (R, r)-Unbiasedness
Optimal Weakly (R, r)-Unbiased Estimators
Feasible Estimators: Optimal Substitution of β in β̂<sub>1</sub>(β, A)
RLSE instead of the Mixed Estimator
Prediction in the Generalized Regression Model
Some Simple Linear Models
The Constant Mean Model
The Linear Trend Model
Polynomial Models
The Prediction Model
Optimal Heterogeneous Prediction
Optimal Homogeneous Prediction
MDE Matrix Comparisons between Optimal and Classical Predictors
Comparison of Classical and Optimal Prediction with Respect to the y* Superiority
Comparison of Classical and Optimal Predictors with Respect to the X*β Superiority
Prediction Regions
Concepts and Definitions
On q-Prediction Intervals
On q-Intervals in Regression Analysis
On (p, q)-Prediction Intervals
Linear Utility Functions
Normally Distributed Populations - Two-Sided Symmetric Intervals
One-Sided Infinite Intervals
Utility and Length of Intervals
Utility and Coverage
Maximal Utility and Optimal Tests
Prediction Ellipsoids Based on the GLSE
Comparing the Efficiency of Prediction Ellipsoids
Simultaneous Prediction of Actual and Average Values of y
Specification of Target Function
Exact Linear Restrictions
MDEP Using Ordinary Least Squares Estimator
MDEP Using Restricted Estimator
MDEP Matrix Comparison
Stein-Rule Predictor
Outside Sample Predictions
Kalman Filter
Dynamical and Observational Equations
Some Theorems
Kalman Model
Sensitivity Analysis
Prediction Matrix
Effect of Single Observation on Estimation of Parameters
Measures Based on Residuals
Algebraic Consequences of Omitting an Observation
Detection of Outliers
Diagnostic Plots for Testing the Model Assumptions
Measures Based on the Confidence Ellipsoid
Partial Regression Plots
Regression Diagnostics for Removing an Observation with Graphics
Model Selection Criteria
Akaike's Information Criterion
Bayesian Information Criterion
Mallows C<sub>p</sub>
Analysis of Incomplete Data Sets
Statistical Methods with Missing Data
Complete Case Analysis
Available Case Analysis
Filling in the Missing Values
Model-Based Procedures
Missing-Data Mechanisms
Missing Indicator Matrix
Missing Completely at Random
Missing at Random
Nonignorable Nonresponse
Missing Pattern
Missing Data in the Response
Least-Squares Analysis for Filled-Up Data: Yates Procedure
Analysis of Covariance: Bartlett's Method
Shrinkage Estimation by Yates Procedure
Shrinkage Estimators
Efficiency Properties
Missing Values in the X-Matrix
General Model
Missing Values and Loss in Efficiency
Methods for Incomplete X-Matrices
Complete Case Analysis
Available Case Analysis
Maximum-Likelihood Methods
Imputation Methods for Incomplete X-Matrices
Maximum-Likelihood Estimates of Missing Values
Zero-Order Regression
First-Order Regression
Multiple Imputation
Weighted Mixed Regression
The Two-Stage WMRE
Assumptions about the Missing Mechanism
Regression Diagnostics to Identify Non-MCAR Processes
Comparison of the Means
Comparing the Variance-Covariance Matrices
Diagnostic Measures from Sensitivity Analysis
Distribution of the Measures and Test Procedure
Treatment of Nonignorable Nonresponse
Joint Distribution of (X,Y) with Missing Values Only in Y
Conditional Distribution of Y Given X with Missing Values Only in Y
Conditional Distribution of Y Given X with Missing Values Only in X
Other Approaches
Further Literature
Robust Regression
Least Absolute Deviation Estimators - Univariate Case
M-Estimates: Univariate Case
Asymptotic Distributions of LAD Estimators
Univariate Case
Multivariate Case
General M-Estimates
Tests of Significance
Models for Categorical Response Variables
Generalized Linear Models
Extension of the Regression Model
Structure of the Generalized Linear Model
Score Function and Information Matrix
Maximum-Likelihood Estimation
Testing of Hypotheses and Goodness of Fit
Quasi Loglikelihood
Contingency Tables
Ways of Comparing Proportions
Sampling in Two-Way Contingency Tables
Likelihood Function and Maximum-Likelihood Estimates
Testing the Goodness of Fit
GLM for Binary Response
Logit Models and Logistic Regression
Testing the Model
Distribution Function as a Link Function
Logit Models for Categorical Data
Goodness of Fit: Likelihood-Ratio Test
Loglinear Models for Categorical Variables
Two-Way Contingency Tables
Three-Way Contingency Tables
The Special Case of Binary Response
Coding of Categorical Explanatory Variables
Dummy and Effect Coding
Coding of Response Models
Coding of Models for the Hazard Rate
Extensions to Dependent Binary Variables
Modeling Approaches for Correlated Response
Quasi-Likelihood Approach for Correlated Binary Response
The GEE Method by Liang and Zeger
Properties of the GEE Estimate β̂<sub>G</sub>
Efficiency of the GEE and IEE Methods
Choice of the Quasi-Correlation Matrix R<sub>t</sub>(α)
Bivariate Binary Correlated Response Variables
The GEE Method
The IEE Method
An Example from the Field of Dentistry
Full Likelihood Approach for Marginal Models
Matrix Algebra
Trace of a Matrix
Determinant of a Matrix
Inverse of a Matrix
Orthogonal Matrices
Rank of a Matrix
Range and Null Space
Eigenvalues and Eigenvectors
Decomposition of Matrices
Definite Matrices and Quadratic Forms
Idempotent Matrices
Generalized Inverse
Functions of Normally Distributed Variables
Differentiation of Scalar Functions of Matrices
Miscellaneous Results, Stochastic Convergence
Software for Linear Regression Models
Special-Purpose Software
