Probability and Conditional Expectation: Fundamentals for the Empirical Sciences
Probability and Conditional Expectations bridges the gap between books on probability theory and statistics by providing the probabilistic concepts estimated and tested in analysis of variance, regression analysis, factor analysis, structural equation modeling, hierarchical linear models and analysi...
Classification: | Electronic book |
---|---|
Main author: | |
Other authors: | |
Format: | Electronic eBook |
Language: | English |
Published: | Chichester, West Sussex : John Wiley & Sons, Inc., 2017 |
Topics: | |
Online access: | Full text |
Table of Contents:
- Intro
- Probability and Conditional Expectation
- Contents
- Preface
- Why another book on probability?
- What is it about?
- For whom is it?
- Prerequisites
- Acknowledgements
- About the companion website
- Part I Measure-theoretical foundations of probability theory
- 1 Measure
- 1.1 Introductory examples
- 1.2 σ-Algebra and measurable space
- 1.2.1 σ-Algebra generated by a set system
- 1.2.2 σ-Algebra of Borel sets on ℝⁿ
- 1.2.3 σ-Algebra on a Cartesian product
- 1.2.4 ∩-Stable set systems that generate a σ-algebra
- 1.3 Measure and measure space
- 1.3.1 σ-Additivity and related properties
- 1.3.2 Other properties
- 1.4 Specific measures
- 1.4.1 Dirac measure and counting measure
- 1.4.2 Lebesgue measure
- 1.4.3 Other examples of a measure
- 1.4.4 Finite and σ-finite measures
- 1.4.5 Product measure
- 1.5 Continuity of a measure
- 1.6 Specifying a measure via a generating system
- 1.7 σ-Algebra that is trivial with respect to a measure
- 1.8 Proofs
- 2 Measurable mapping
- 2.1 Image and inverse image
- 2.2 Introductory examples
- 2.2.1 Example 1: Rectangles
- 2.2.2 Example 2: Flipping two coins
- 2.3 Measurable mapping
- 2.3.1 Measurable mapping
- 2.3.2 σ-Algebra generated by a mapping
- 2.3.3 Final σ-algebra
- 2.3.4 Multivariate mapping
- 2.3.5 Projection mapping
- 2.3.6 Measurability with respect to a mapping
- 2.4 Theorems on measurable mappings
- 2.4.1 Measurability of a composition
- 2.4.2 Theorems on measurable functions
- 2.5 Equivalence of two mappings with respect to a measure
- 2.6 Image measure
- 2.7 Proofs
- 3 Integral
- 3.1 Definition
- 3.1.1 Integral of a nonnegative step function
- 3.1.2 Integral of a nonnegative measurable function
- 3.1.3 Integral of a measurable function
- 3.2 Properties
- 3.2.1 Integral of μ-equivalent functions
- 3.2.2 Integral with respect to a weighted sum of measures
- 3.2.3 Integral with respect to an image measure
- 3.2.4 Convergence theorems
- 3.3 Lebesgue and Riemann integral
- 3.4 Density
- 3.5 Absolute continuity and the Radon-Nikodym theorem
- 3.6 Integral with respect to a product measure
- 3.7 Proofs
- Part II Probability, random variable, and its distribution
- 4 Probability measure
- 4.1 Probability measure and probability space
- 4.1.1 Definition
- 4.1.2 Formal and substantive meaning of probabilistic terms
- 4.1.3 Properties of a probability measure
- 4.1.4 Examples
- 4.2 Conditional probability
- 4.2.1 Definition
- 4.2.2 Filtration and time order between events and sets of events
- 4.2.3 Multiplication rule
- 4.2.4 Examples
- 4.2.5 Theorem of total probability
- 4.2.6 Bayes' theorem
- 4.2.7 Conditional-probability measure
- 4.3 Independence
- 4.3.1 Independence of events
- 4.3.2 Independence of set systems
- 4.4 Conditional independence given an event
- 4.4.1 Conditional independence of events given an event
- 4.4.2 Conditional independence of set systems given an event
- 4.5 Proofs
- 5 Random variable, distribution, density, and distribution function
- 5.1 Random variable and its distribution
- 5.2 Equivalence of two random variables with respect to a probability measure
- 5.2.1 Identical and P-equivalent random variables
- 5.2.2 P-equivalence, P^B-equivalence, and absolute continuity
- 5.3 Multivariate random variable
- 5.4 Independence of random variables
- 5.5 Probability function of a discrete random variable
- 5.6 Probability density with respect to a measure
- 5.6.1 General concepts and properties
- 5.6.2 Density of a discrete random variable
- 5.6.3 Density of a bivariate random variable
- 5.7 Uni- or multivariate real-valued random variable
- 5.7.1 Distribution function of a univariate real-valued random variable
- 5.7.2 Distribution function of a multivariate real-valued random variable
- 5.7.3 Density of a continuous univariate real-valued random variable
- 5.7.4 Density of a continuous multivariate real-valued random variable
- 5.8 Proofs
- 6 Expectation, variance, and other moments
- 6.1 Expectation
- 6.1.1 Definition
- 6.1.2 Expectation of a discrete random variable
- 6.1.3 Computing the expectation using a density
- 6.1.4 Transformation theorem
- 6.1.5 Rules of computation
- 6.2 Moments, variance, and standard deviation
- 6.3 Proofs
- 7 Linear quasi-regression, covariance, and correlation
- 7.1 Linear quasi-regression
- 7.2 Covariance
- 7.3 Correlation
- 7.4 Expectation vector and covariance matrix
- 7.4.1 Random vector and random matrix
- 7.4.2 Expectation of a random vector and a random matrix
- 7.4.3 Covariance matrix of two multivariate random variables
- 7.5 Multiple linear quasi-regression
- 7.6 Proofs
- 8 Some distributions
- 8.1 Some distributions of discrete random variables
- 8.1.1 Discrete uniform distribution
- 8.1.2 Bernoulli distribution
- 8.1.3 Binomial distribution
- 8.1.4 Poisson distribution
- 8.1.5 Geometric distribution
- 8.2 Some distributions of continuous random variables
- 8.2.1 Continuous uniform distribution
- 8.2.2 Normal distribution
- 8.2.3 Multivariate normal distribution
- 8.2.4 Central χ²-distribution
- 8.2.5 Central t-distribution
- 8.2.6 Central F-distribution
- 8.3 Proofs
- Part III Conditional expectation and regression
- 9 Conditional expectation value and discrete conditional expectation
- 9.1 Conditional expectation value
- 9.2 Transformation theorem
- 9.3 Other properties
- 9.4 Discrete conditional expectation
- 9.5 Discrete regression
- 9.6 Examples
- 9.7 Proofs
- 10 Conditional expectation
- 10.1 Assumptions and definitions
- 10.2 Existence and uniqueness
- 10.2.1 Uniqueness with respect to a probability measure
- 10.2.2 A necessary and sufficient condition of uniqueness
- 10.2.3 Examples
- 10.3 Rules of computation and other properties
- 10.3.1 Rules of computation
- 10.3.2 Monotonicity
- 10.3.3 Convergence theorems
- 10.4 Factorization, regression, and conditional expectation value
- 10.4.1 Existence of a factorization
- 10.4.2 Conditional expectation and mean squared error
- 10.4.3 Uniqueness of a factorization
- 10.4.4 Conditional expectation value
- 10.5 Characterizing a conditional expectation by the joint distribution
- 10.6 Conditional mean independence
- 10.7 Proofs
- 11 Residual, conditional variance, and conditional covariance
- 11.1 Residual with respect to a conditional expectation
- 11.2 Coefficient of determination and multiple correlation
- 11.3 Conditional variance and covariance given a σ-algebra
- 11.4 Conditional variance and covariance given a value of a random variable
- 11.5 Properties of conditional variances and covariances
- 11.6 Partial correlation
- 11.7 Proofs
- 12 Linear regression
- 12.1 Basic ideas
- 12.2 Assumptions and definitions
- 12.3 Examples
- 12.4 Linear quasi-regression
- 12.5 Uniqueness and identification of regression coefficients
- 12.6 Linear regression
- 12.7 Parameterizations of a discrete conditional expectation
- 12.8 Invariance of regression coefficients
- 12.9 Proofs
- 13 Linear logistic regression
- 13.1 Logit transformation of a conditional probability
- 13.2 Linear logistic parameterization
- 13.3 A parameterization of a discrete conditional probability
- 13.4 Identification of coefficients of a linear logistic parameterization
- 13.5 Linear logistic regression and linear logit regression
- 13.6 Proofs.
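
As a minimal illustration of the material in Chapter 4 (conditional probability, the theorem of total probability, and Bayes' theorem), the following Python sketch computes P(A | B) from an assumed prior P(A) and assumed conditional probabilities P(B | A) and P(B | Aᶜ); the numerical values are hypothetical and not taken from the book.

```python
# Hypothetical binary setup: A is an event of interest, B is an observed event
# (e.g., a positive diagnostic result). All probabilities below are assumed values.

def bayes_posterior(p_a: float, p_b_given_a: float, p_b_given_not_a: float) -> float:
    """Return P(A | B) using the theorem of total probability and Bayes' theorem."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1.0 - p_a)  # theorem of total probability
    return p_b_given_a * p_a / p_b                            # Bayes' theorem

print(bayes_posterior(p_a=0.01, p_b_given_a=0.95, p_b_given_not_a=0.05))
# ~0.161: observing B raises the probability of A from 1% to about 16%
```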
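Chapters 7 and 12 treat the linear quasi-regression of Y on X, whose coefficients can be written as β₁ = Cov(X, Y) / Var(X) and β₀ = E(Y) − β₁ E(X). A minimal sketch that estimates these two coefficients from a small set of hypothetical paired observations:

```python
# Hypothetical paired observations; the coefficients of the linear
# quasi-regression of Y on X are beta1 = Cov(X, Y) / Var(X) and
# beta0 = E(Y) - beta1 * E(X). Here they are estimated from a tiny sample.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 2.9, 4.2, 4.8, 6.1]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / n
var_x = sum((xi - mean_x) ** 2 for xi in x) / n

beta1 = cov_xy / var_x              # slope
beta0 = mean_y - beta1 * mean_x     # intercept

print(beta0, beta1)                 # 1.05 0.99 for the data above
```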
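For the book's central concept, Chapter 9 introduces the discrete conditional expectation, whose values E(Y | X = x) are obtained from the joint distribution of X and Y. A sketch assuming a small hypothetical joint probability function stored as a Python dict:

```python
from collections import defaultdict

# Hypothetical joint probability function P(X = x, Y = y) of two discrete
# random variables; the numbers are illustrative only and sum to 1.
joint = {
    (0, 0): 0.20, (0, 1): 0.10,
    (1, 0): 0.15, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.25,
}

def conditional_expectation_values(joint_pf):
    """Return {x: E(Y | X = x)} computed from a discrete joint probability function."""
    p_x = defaultdict(float)    # marginal P(X = x)
    s_xy = defaultdict(float)   # sum over y of y * P(X = x, Y = y)
    for (x, y), p in joint_pf.items():
        p_x[x] += p
        s_xy[x] += y * p
    return {x: s_xy[x] / p_x[x] for x in p_x}

print(conditional_expectation_values(joint))
# {0: 0.333..., 1: 0.625, 2: 0.833...} -- the values of the discrete regression x -> E(Y | X = x)
```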
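Chapter 13 studies the logit transformation of a conditional probability and the linear logistic parameterization P(Y = 1 | X = x) = 1 / (1 + exp(−(β₀ + β₁ x))). A minimal sketch with hypothetical coefficients:

```python
import math

def logit(p: float) -> float:
    """logit(p) = ln(p / (1 - p)) for 0 < p < 1."""
    return math.log(p / (1.0 - p))

def logistic(b0: float, b1: float, x: float) -> float:
    """Conditional probability P(Y = 1 | X = x) under a linear logistic parameterization."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Assumed illustrative coefficients, not taken from the book.
p = logistic(b0=-1.0, b1=0.5, x=2.0)
print(p, logit(p))   # logit(p) recovers the linear term b0 + b1 * x = 0.0, so p = 0.5
```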