
Methods and Applications of Linear Models: Regression and the Analysis of Variance.

Bibliographic Details
Main Author: Hocking, Ronald R.
Format: Electronic eBook
Language: English
Published: Newark: John Wiley & Sons, Incorporated, 2013.
Series: New York Academy of Sciences Ser.
Online Access: Full text
Table of Contents:
  • Intro
  • Methods and Applications of Linear Models
  • Contents
  • Preface to the Third Edition
  • Preface to the Second Edition
  • Preface to the First Edition
  • PART I REGRESSION
  • 1 Introduction to Linear Models
  • 1.1 Background Information
  • 1.2 Mathematical and Statistical Models
  • 1.3 Definition of the Linear Model
  • 1.4 Examples of Regression Models
  • 1.4.1 Single-Variable Regression Model
  • 1.4.2 Regression Models with Several Inputs
  • 1.4.3 Discrete Response Variables
  • 1.4.4 Multivariate Linear Models
  • 1.5 Concluding Comments
  • Exercises
  • 2 Regression on Functions of One Variable
  • 2.1 The Simple Linear Regression Model
  • 2.2 Parameter Estimation
  • 2.2.1 Least Squares Estimation
  • 2.2.2 Maximum Likelihood Estimation
  • 2.2.3 Coded Data: Centering and Scaling
  • 2.2.4 The Analysis of Variance Table
  • 2.3 Properties of the Estimators and Test Statistics
  • 2.3.1 Moments of Linear Functions of Random Variables
  • 2.3.2 Moments of Least Squares Estimators
  • 2.3.3 Distribution of the Least Squares Estimators
  • 2.3.4 The Distribution of Test Statistics
  • 2.4 The Analysis of Simple Linear Regression Models
  • 2.4.1 Two Numerical Examples
  • 2.4.2 A Test for Lack-of-Fit
  • 2.4.3 Inference on the Parameters of the Model
  • 2.4.4 Prediction and Prediction Intervals
  • 2.5 Examining the Data and the Model
  • 2.5.1 Residuals
  • 2.5.2 Outliers, Extreme Points, and Influence
  • 2.5.3 Normality, Independence, and Variance Homogeneity
  • 2.6 Polynomial Regression Models
  • 2.6.1 The Quadratic Model
  • 2.6.2 Higher Ordered Polynomial Models
  • 2.6.3 Orthogonal Polynomials
  • 2.6.4 Regression through the Origin
  • Exercises
  • 3 Transforming the Data
  • 3.1 The Need for Transformations
  • 3.2 Weighted Least Squares
  • 3.3 Variance Stabilizing Transformations
  • 3.4 Transformations to Achieve a Linear Model
  • 3.4.1 Transforming the Dependent Variable
  • 3.4.2 Transforming the Predictors
  • 3.5 Analysis of the Transformed Model
  • 3.5.1 Transformations with Forbes Data
  • Exercises
  • 4 Regression on Functions of Several Variables
  • 4.1 The Multiple Linear Regression Model
  • 4.2 Preliminary Data Analysis
  • 4.3 Analysis of the Multiple Linear Regression Model
  • 4.3.1 Fitting the Model in Centered Form
  • 4.3.2 Estimation and Analysis of the Original Data
  • 4.3.3 Model Assessment and Residual Analysis
  • 4.3.4 Prediction
  • 4.3.5 Transforming the Response
  • 4.4 Partial Correlation and Added-Variable Plots
  • 4.4.1 Partial Correlation
  • 4.4.2 Added-Variable Plots
  • 4.4.3 Simple Versus Partial Correlation
  • 4.5 Variable Selection
  • 4.5.1 The Case of Orthogonal Predictors
  • 4.5.2 Criteria for Deletion of Variables
  • 4.5.3 Nonorthogonal Predictors
  • 4.5.4 Computational Considerations
  • 4.5.5 Selection Strategies
  • 4.6 Model Specification
  • 4.6.1 Application to Subset Selection
  • 4.6.2 Improved Mean Squared Error