Mixed Models: Theory and Applications with R
| Classification: | Electronic book |
| --- | --- |
| Main author: | |
| Format: | Electronic eBook |
| Language: | English |
| Published: | Wiley, 2013 |
| Series: | Wiley Series in Probability and Statistics; 893 |
| Topics: | |
| Online access: | Full text |
Table of Contents:
- Cover
- Title Page
- Copyright Page
- Dedication
- Contents
- Preface
- Preface to the Second Edition
- R Software and Functions
- Data Sets
- Open Problems in Mixed Models
- 1 Introduction: Why Mixed Models?
- 1.1 Mixed effects for clustered data
- 1.2 ANOVA, variance components, and the mixed model
- 1.3 Other special cases of the mixed effects model
- 1.4 Compromise between Bayesian and frequentist approaches
- 1.5 Penalized likelihood and mixed effects
- 1.6 Healthy Akaike information criterion
- 1.7 Penalized smoothing
- 1.8 Penalized polynomial fitting
- 1.9 Restraining parameters, or what to eat
- 1.10 Ill-posed problems, Tikhonov regularization, and mixed effects
- 1.11 Computerized tomography and linear image reconstruction
- 1.12 GLMM for PET
- 1.13 Maple leaf shape analysis
- 1.14 DNA Western blot analysis
- 1.15 Where does the wind blow?
- 1.16 Software and books
- 1.17 Summary points
- 2 MLE for the LME Model
- 2.1 Example: weight versus height
- 2.1.1 The first R script
- 2.2 The model and log-likelihood functions
- 2.2.1 The model
- 2.2.2 Log-likelihood functions
- 2.2.3 Dimension-reduction formulas
- 2.2.4 Profile log-likelihood functions
- 2.2.5 Dimension-reduction GLS estimate
- 2.2.6 Restricted maximum likelihood
- 2.2.7 Weight versus height (continued)
- 2.3 Balanced random-coefficient model
- 2.4 LME model with random intercepts
- 2.4.1 Balanced random-intercept model
- 2.4.2 How random effect affects the variance of MLE
- 2.5 Criterion for MLE existence
- 2.6 Criterion for the positive definiteness of matrix D
- 2.6.1 Example of an invalid LME model
- 2.7 Pre-estimation bounds for variance parameters
- 2.8 Maximization algorithms
- 2.9 Derivatives of the log-likelihood function
- 2.10 Newton-Raphson algorithm
- 2.11 Fisher scoring algorithm
- 2.11.1 Simplified FS algorithm
- 2.11.2 Empirical FS algorithm
- 2.11.3 Variance-profile FS algorithm
- 2.12 EM algorithm
- 2.12.1 Fixed-point algorithm
- 2.13 Starting point
- 2.13.1 FS starting point
- 2.13.2 FP starting point
- 2.14 Algorithms for restricted MLE
- 2.14.1 Fisher scoring algorithm
- 2.14.2 EM algorithm
- 2.15 Optimization on nonnegative definite matrices
- 2.15.1 How often can one hit the boundary?
- 2.15.2 Allow matrix D to be not nonnegative definite
- 2.15.3 Force matrix D to stay nonnegative definite
- 2.15.4 Matrix D reparameterization
- 2.15.5 Criteria for convergence
- 2.16 lmeFS and lme in R
- 2.17 Appendix: proof of the existence of MLE
- 2.18 Summary points
- 3 Statistical Properties of the LME Model
- 3.1 Introduction
- 3.2 Identifiability of the LME model
- 3.2.1 Linear regression with random coefficients
- 3.3 Information matrix for variance parameters
- 3.3.1 Efficiency of variance parameters for balanced data
- 3.4 Profile-likelihood confidence intervals
- 3.5 Statistical testing of the presence of random effects