
Foundations of Linear and Generalized Linear Models

A valuable overview of the most important ideas and results in statistical modeling. Written by a highly experienced author, Foundations of Linear and Generalized Linear Models is a clear and comprehensive guide to the key concepts and results of linear statistical models. The book presents a broad, i...


Bibliographic Details
Classification: Electronic Book
Main author: Agresti, Alan
Format: Electronic eBook
Language: English
Published: Newark: John Wiley & Sons, Incorporated, 2015.
Series: New York Academy of Sciences Ser.
Subjects:
Online access: Full text
Table of Contents:
  • Intro
  • Foundations of Linear and Generalized Linear Models
  • Contents
  • Preface
  • Purpose of this book
  • Use as a textbook
  • Acknowledgments
  • 1 Introduction to Linear and Generalized Linear Models
  • 1.1 Components of a Generalized Linear Model
  • 1.1.1 Random Component of a GLM
  • 1.1.2 Linear Predictor of a GLM
  • 1.1.3 Link Function of a GLM
  • 1.1.4 A GLM with Identity Link Function is a "Linear Model"
  • 1.1.5 GLMs for Normal, Binomial, and Poisson Responses
  • 1.1.6 Advantages of GLMs versus Transforming the Data
  • 1.2 Quantitative/Qualitative Explanatory Variables and Interpreting Effects
  • 1.2.1 Quantitative and Qualitative Variables in Linear Predictors
  • 1.2.2 Interval, Nominal, and Ordinal Variables
  • 1.2.3 Interpreting Effects in Linear Models
  • 1.3 Model Matrices and Model Vector Spaces
  • 1.3.1 Model Matrices Induce Model Vector Spaces
  • 1.3.2 Dimension of Model Space Equals Rank of Model Matrix
  • 1.3.3 Example: The One-Way Layout
  • 1.4 Identifiability and Estimability
  • 1.4.1 Identifiability of GLM Model Parameters
  • 1.4.2 Estimability in Linear Models
  • 1.5 Example: Using Software to Fit a GLM
  • 1.5.1 Example: Male Satellites for Female Horseshoe Crabs
  • 1.5.2 Linear Model Using Weight to Predict Satellite Counts
  • 1.5.3 Comparing Mean Numbers of Satellites by Crab Color
  • Chapter Notes
  • Exercises
  • 2 Linear Models: Least Squares Theory
  • 2.1 Least Squares Model Fitting
  • 2.1.1 The Normal Equations and Least Squares Solution
  • 2.1.2 Hat Matrix and Moments of Estimators
  • 2.1.3 Bivariate Linear Model and Regression Toward the Mean
  • 2.1.4 Least Squares Solutions When X Does Not Have Full Rank
  • 2.1.5 Orthogonal Subspaces and Residuals
  • 2.1.6 Alternatives to Least Squares
  • 2.2 Projections of Data Onto Model Spaces
  • 2.2.1 Projection Matrices
  • 2.2.2 Projection Matrices for Linear Model Spaces
  • 2.2.3 Example: The Geometry of a Linear Model
  • 2.2.4 Orthogonal Columns and Parameter Orthogonality
  • 2.2.5 Pythagoras's Theorem Applications for Linear Models
  • 2.3 Linear Model Examples: Projections and SS Decompositions
  • 2.3.1 Example: Null Model
  • 2.3.2 Example: Model for the One-Way Layout
  • 2.3.3 Sums of Squares and ANOVA Table for One-Way Layout
  • 2.3.4 Example: Model for Two-Way Layout with Randomized Block Design
  • 2.4 Summarizing Variability in a Linear Model
  • 2.4.1 Estimating the Error Variance for a Linear Model
  • 2.4.2 Sums of Squares: Error (SSE) and Regression (SSR)
  • 2.4.3 Effect on SSR and SSE of Adding Explanatory Variables
  • 2.4.4 Sequential and Partial Sums of Squares
  • 2.4.5 Uncorrelated Predictors: Sequential SS = Partial SS = SSR Component
  • 2.4.6 R-Squared and the Multiple Correlation
  • 2.5 Residuals, Leverage, and Influence
  • 2.5.1 Residuals and Fitted Values Are Uncorrelated
  • 2.5.2 Plots of Residuals