
Linear Models

Bibliographic Details
Main Author: Searle, Shayle R.
Format: Electronic eBook
Language: English
Published: Newark : John Wiley & Sons, Incorporated, 2016.
Series: New York Academy of Sciences Ser.
Subjects:
Online Access: Full text
Table of Contents:
  • Intro
  • Linear Models
  • Contents
  • Preface
  • Preface to First Edition
  • About the Companion Website
  • Introduction and Overview
  • 1 Generalized Inverse Matrices
  • 1 Introduction
  • a Definition and Existence of a Generalized Inverse
  • b An Algorithm for Obtaining a Generalized Inverse
  • c Obtaining Generalized Inverses Using the Singular Value Decomposition (SVD)
  • 2 Solving Linear Equations
  • a Consistent Equations
  • b Obtaining Solutions
  • c Properties of Solutions
  • 3 The Penrose Inverse
  • 4 Other Definitions
  • 5 Symmetric Matrices
  • a Properties of a Generalized Inverse
  • b Two More Generalized Inverses of X′X
  • 6 Arbitrariness in a Generalized Inverse
  • 7 Other Results
  • 8 Exercises
  • 2 Distributions and Quadratic Forms
  • 1 Introduction
  • 2 Symmetric Matrices
  • 3 Positive Definiteness
  • 4 Distributions
  • a Multivariate Density Functions
  • b Moments
  • c Linear Transformations
  • d Moment and Cumulant Generating Functions
  • e Univariate Normal
  • f Multivariate Normal
  • g Central χ², F, and t
  • h Non-central χ²
  • i Non-central F
  • j The Non-central t Distribution
  • 5 Distribution of Quadratic Forms
  • a Cumulants
  • b Distributions
  • c Independence
  • 6 Bilinear Forms
  • 7 Exercises
  • 3 Regression for the Full-Rank Model
  • 1 Introduction
  • a The Model
  • b Observations
  • c Estimation
  • d The General Case of k x-Variables
  • e Intercept and No-Intercept Models
  • 2 Deviations From Means
  • 3 Some Methods of Estimation
  • a Ordinary Least Squares
  • b Generalized Least Squares
  • c Maximum Likelihood
  • d The Best Linear Unbiased Estimator (b.l.u.e.) (Gauss-Markov Theorem)
  • e Least-squares Theory When The Parameters are Random Variables
  • 4 Consequences of Estimation
  • a Unbiasedness
  • b Variances
  • c Estimating E(y)
  • d Residual Error Sum of Squares
  • e Estimating the Residual Error Variance
  • f Partitioning the Total Sum of Squares
  • g Multiple Correlation
  • 5 Distributional Properties
  • a The Vector of Observations y is Normal
  • b The Least-squares Estimator b̂ is Normal
  • c The Least-squares Estimator b̂ and the Estimator of the Variance σ̂² are Independent
  • d The Distribution of SSE/σ² is a χ² Distribution
  • e Non-central χ²'s
  • f F-distributions
  • g Analyses of Variance
  • h Tests of Hypotheses
  • i Confidence Intervals
  • j More Examples
  • k Pure Error
  • 6 The General Linear Hypothesis
  • a Testing Linear Hypotheses
  • b Estimation Under the Null Hypothesis
  • c Four Common Hypotheses
  • d Reduced Models
  • e Stochastic Constraints
  • f Exact Quadratic Constraints (Ridge Regression)
  • 7 Related Topics
  • a The Likelihood Ratio Test
  • b Type I and Type II Errors
  • c The Power of a Test
  • d Estimating Residuals
  • 8 Summary of Regression Calculations
  • 9 Exercises
  • 4 Introducing Linear Models: Regression on Dummy Variables
  • 1 Regression on Allocated Codes
  • a Allocated Codes
  • b Difficulties and Criticism
  • c Grouped Variables
  • d Unbalanced Data