Random processes for image and signal processing
Part of the SPIE/IEEE Series on Imaging Science and Engineering. This book provides a framework for understanding the ensemble of temporal, spatial, and higher-dimensional processes in science and engineering that vary randomly in observations. Suitable as a text for undergraduate and graduate students.
Classification: Electronic book
Main author:
Format: Electronic eBook
Language: English
Published: Bellingham, Wash. : SPIE Optical Engineering Press ; New York : Institute of Electrical and Electronics Engineers, ©1999.
Series: SPIE/IEEE series on imaging science & engineering.
Subjects:
Online access: Full text
Table of Contents:
- Chapter 1. Probability theory
- Probability space
- Events
- Conditional probability
- Random variables
- Probability distributions
- Probability densities
- Functions of a random variable
- Moments
- Expectation and variance
- Moment-generating function
- Important probability distributions
- Binomial distribution
- Poisson distribution
- Normal distribution
- Gamma distribution
- Beta distribution
- Computer simulation
- Multivariate distributions
- Jointly distributed random variables
- Conditioning
- Independence
- Functions of several random variables
- Basic arithmetic functions of two random variables
- Distributions of sums of independent random variables
- Joint distributions of output random variables
- Expectation of a function of several random variables
- Covariance
- Multivariate normal distribution
- Laws of large numbers
- Weak law of large numbers
- Strong law of large numbers
- Central limit theorem
- Parametric estimation via random samples
- Random-sample estimators
- Sample mean and sample variance
- Minimum-variance unbiased estimators
- Method of moments
- Order statistics
- Maximum-likelihood estimation
- Maximum-likelihood estimators
- Additive noise
- Minimum noise
- Entropy
- Uncertainty
- Information
- Entropy of a random vector
- Source coding
- Prefix codes
- Optimal coding
- Exercises for chapter 1.
- Chapter 2. Random processes
- Random functions
- Moments of a random function
- Mean and covariance functions
- Mean and covariance of a sum
- Differentiation
- Differentiation of random functions
- Mean-square differentiability
- Integration
- Mean ergodicity
- Poisson process
- One-dimensional Poisson model
- Derivative of the Poisson process
- Properties of Poisson points
- Axiomatic formulation of the Poisson process
- Wiener process and white noise
- White noise
- Random walk
- Wiener process
- Stationarity
- Wide-sense stationarity
- Mean-ergodicity for WS stationary processes
- Covariance-ergodicity for WS stationary processes
- Strict-sense stationarity
- Estimation
- Linear systems
- Commutation of a linear operator with expectation
- Representation of linear operators
- Output covariance
- Exercises for chapter 2.
- Chapter 3. Canonical representation
- Canonical expansions
- Fourier representation and projections
- Expansion of the covariance function
- Karhunen-Loève expansion
- The Karhunen-Loève theorem
- Discrete Karhunen-Loève expansion
- Canonical expansions with orthonormal coordinate functions
- Relation to data compression
- Noncanonical representation
- Generalized Bessel inequality
- Decorrelation
- Trigonometric representation
- Trigonometric Fourier series
- Generalized Fourier coefficients for WS stationary processes
- Mean-square periodic WS stationary processes
- Expansions as transforms
- Orthonormal transforms of random functions
- Fourier descriptors
- Transform coding
- Karhunen-Loève compression
- Transform compression using arbitrary orthonormal systems
- Walsh-Hadamard transform
- Discrete cosine transform
- Transform coding for digital images
- Optimality of the Karhunen-Loève transform
- Coefficients generated by linear functionals
- Coefficients from integral functionals
- Generating bi-orthogonal function systems
- Complete function systems
- Canonical expansion of the covariance function
- Canonical expansions from covariance expansions
- Constructing canonical expansions for covariance functions
- Integral canonical expansions
- Construction via integral functional coefficients
- Construction from a covariance expansion
- Power spectral density
- The power-spectral-density/autocorrelation transform pair
- Power spectral density and linear operators
- Integral representation of WS stationary random functions
- Canonical representation of vector random functions
- Vector random functions
- Canonical expansions for vector random functions
- Finite sets of random vectors
- Canonical representation over a discrete set
- Exercises for chapter 3.
- Chapter 4. Optimal filtering
- Optimal mean-square-error filters
- Conditional expectation
- Optimal nonlinear filter
- Optimal filter for jointly normal random variables
- Multiple observation variables
- Bayesian parametric estimation
- Optimal finite-observation linear filters
- Linear filters and the orthogonality principle
- Design of the optimal linear filter
- Optimal linear filter in the jointly Gaussian case
- Role of wide-sense stationarity
- Signal-plus-noise model
- Edge detection
- Steepest descent
- Steepest descent iterative algorithm
- Convergence of the steepest-descent algorithm
- Least-mean-square adaptive algorithm
- Convergence of the LMS algorithm
- Nonstationary processes
- Least-squares estimation
- Pseudoinverse estimator
- Least-squares estimation for nonwhite noise
- Multiple linear regression
- Least-squares image restoration
- Optimal linear estimation of random vectors
- Optimal linear filter for linearly dependent observations
- Optimal estimation of random vectors
- Optimal linear filters for random vectors
- Recursive linear filters
- Recursive generation of direct sums
- Static recursive optimal linear filtering
- Dynamic recursive optimal linear filtering
- Optimal infinite-observation linear filters
- Wiener-Hopf equation
- Wiener filter
- Optimal linear filter in the context of a linear model
- The linear signal model
- Procedure for finding the optimal linear filter
- Additive white noise
- Discrete domains
- Optimal linear filters via canonical expansions
- Integral decomposition into white noise
- Integral equations involving the autocorrelation function
- Solution via discrete canonical expansions
- Optimal binary filters
- Binary conditional expectation
- Boolean functions and optimal translation-invariant filters
- Optimal increasing filters
- Pattern classification
- Optimal classifiers
- Gaussian maximum-likelihood classification
- Linear discriminants
- Neural networks
- Two-layer neural networks
- Steepest descent for nonquadratic error surfaces
- Sum-of-squares error
- Error back-propagation
- Error back-propagation for multiple outputs
- Adaptive network design
- Exercises for chapter 4.
- Chapter 5. Random models
- Markov chains
- Chapman-Kolmogorov equations
- Transition probability matrix
- Markov processes
- Steady-state distributions for discrete-time Markov chains
- Long-run behavior of a two-state Markov chain
- Classification of states
- Steady-state and stationary distributions
- Long-run behavior of finite Markov chains
- Long-run behavior of Markov chains with infinite state spaces
- Steady-state distributions for continuous-time Markov chains
- Irreducible continuous-time Markov chains
- Birth-death model: queues
- Forward and backward Kolmogorov equations
- Markov random fields
- Neighborhood systems
- Determination by conditional probabilities
- Gibbs distributions
- Random Boolean model
- Germ-grain model
- Vacancy
- Hitting
- Linear Boolean model
- Granulometries
- Openings
- Classification by granulometric moments
- Adaptive reconstructive openings
- Random sets
- Hit-or-miss topology
- Convergence and continuity
- Random closed sets
- Capacity functional
- Exercises for chapter 5.
- Bibliography
- Index.