A signal theoretic introduction to random processes /
A fresh introduction to random processes utilizing signal theory. By incorporating a signal theory basis, A Signal Theoretic Introduction to Random Processes presents a unique introduction to random processes with an emphasis on the important random phenomena encountered in the electronic and communi...
Classification: eBook
Main Author:
Format: Electronic eBook
Language: English
Published: Hoboken, New Jersey : John Wiley & Sons, Inc., [2015]
Subjects:
Online Access: Full text (requires prior registration with an institutional email address)
Table of Contents:
- Title Page
- Copyright Page
- About the Author
- Contents
- Preface
- Chapter 1 A Signal Theoretic Introduction to Random Processes
- 1.1 INTRODUCTION
- 1.2 MOTIVATION
- 1.2.1 Usefulness of Randomness
- 1.2.2 Engineering
- 1.3 BOOK OVERVIEW
- Chapter 2 Background: Mathematics
- 2.1 INTRODUCTION
- 2.2 SET THEORY
- 2.2.1 Basic Definitions
- 2.2.2 Infinity
- 2.2.3 Supremum and Infimum
- 2.3 FUNCTION THEORY
- 2.3.1 Function Definition
- 2.3.2 Common Functions
- 2.3.3 Function Properties
- 2.4 MEASURE THEORY
- 2.4.1 Sigma Algebra
- 2.4.2 Measure
- 2.4.3 Lebesgue Measure
- 2.5 MEASURABLE FUNCTIONS
- 2.5.1 Simple or Elementary Functions
- 2.6 LEBESGUE INTEGRATION
- 2.6.1 The Lebesgue Integral
- 2.6.2 Demarcation of Signal Space
- 2.6.3 Miscellaneous Results
- 2.7 CONVERGENCE
- 2.7.1 Dominated and Monotone Convergence
- 2.8 LEBESGUE-STIELTJES MEASURE
- 2.8.1 Lebesgue-Stieltjes Measure: Monotonic Function Case
- 2.8.2 Lebesgue-Stieltjes Measure: Decreasing Function
- 2.8.3 Lebesgue-Stieltjes Measure: General Case
- 2.9 LEBESGUE-STIELTJES INTEGRATION
- 2.9.1 Motivation
- 2.9.2 Lebesgue-Stieltjes Integral
- 2.9.3 Lebesgue-Stieltjes Integrals: Specific Cases
- 2.10 MISCELLANEOUS RESULTS
- 2.11 PROBLEMS
- APPENDIX 2.A PROOF OF THEOREM 2.1
- APPENDIX 2.B PROOF OF THEOREM 2.2
- APPENDIX 2.C PROOF OF THEOREM 2.7
- APPENDIX 2.D PROOF OF THEOREM 2.8
- APPENDIX 2.E PROOF OF THEOREM 2.10
- Chapter 3 Background: Signal Theory
- 3.1 INTRODUCTION
- 3.2 SIGNAL ORTHOGONALITY
- 3.2.1 Signal Decomposition
- 3.2.2 Generalization
- 3.2.3 Example: Hermite Basis Set
- 3.3 THEORY FOR DIRICHLET POINTS
- 3.3.1 Existence of Dirichlet Points
- 3.4 DIRAC DELTA
- 3.5 FOURIER THEORY
- 3.5.1 Fourier Series
- 3.5.2 Fourier Transform
- 3.5.3 Inverse Fourier Transform
- 3.5.4 Parseval's Theorem
- 3.6 SIGNAL POWER
- 3.6.1 Sinusoidal Basis Set
- 3.6.2 Arbitrary Basis Set
- 3.7 THE POWER SPECTRAL DENSITY
- 3.7.1 Energy Spectral Density
- 3.7.2 Power Spectral Density: Sinusoidal Basis Set
- 3.8 THE AUTOCORRELATION FUNCTION
- 3.8.1 Definition of the Autocorrelation Function
- 3.9 POWER SPECTRAL DENSITY-AUTOCORRELATION FUNCTION
- 3.9.1 Relationships for Alternative Autocorrelation Function
- 3.10 RESULTS FOR THE INFINITE INTERVAL
- 3.10.1 Average Power
- 3.10.2 The Power Spectral Density
- 3.10.3 Integrated Spectrum
- 3.10.4 Time Averaged Autocorrelation Function
- 3.10.5 Power Spectral Density-Autocorrelation Relationship
- 3.11 CONVERGENCE OF FOURIER COEFFICIENTS
- 3.11.1 Periodic Signal Case
- 3.11.2 Convergence of Fourier Coefficients to Zero
- 3.12 CRAMER'S REPRESENTATION AND TRANSFORM
- 3.12.1 Miscellaneous Mathematical Results
- 3.12.2 Cramer Representation and Transform
- 3.12.3 Initial Approach to the Cramer Transform
- 3.12.4 The Cramer Transform
- 3.12.5 Miscellaneous Results
- 3.12.6 Transform of Common Signals
- 3.12.7 Change in Transform
- 3.12.8 Linear Filtering
- 3.12.9 Integrated Spectrum, Spectrum, and Power Spectrum
- 3.12.10 Cramer Transform of Standard Signals
- 3.13 PROBLEMS
- APPENDIX 3.A PROOF OF THEOREM 3.5
- APPENDIX 3.B PROOF OF THEOREM 3.8
- APPENDIX 3.C FOURIER TRANSFORM AND PSD OF A SINUSOID
- APPENDIX 3.D PROOF OF THEOREM 3.14
- APPENDIX 3.E PROOF OF THEOREM 3.19
- APPENDIX 3.F PROOF OF THEOREM 3.23
- APPENDIX 3.G PROOF OF THEOREM 3.24
- APPENDIX 3.H PROOF OF THEOREM 3.25
- APPENDIX 3.I PROOF OF THEOREM 3.26
- APPENDIX 3.J CRAMER TRANSFORM OF UNIT STEP FUNCTION
- APPENDIX 3.K CRAMER TRANSFORM FOR SINUSOIDAL SIGNALS
- 3.K.1 Complex Exponential Case
- 3.K.2 Sine and Cosine on [−T,T]
- APPENDIX 3.L PROOF OF THEOREM 3.30
- APPENDIX 3.M PROOF OF THEOREM 3.31
- APPENDIX 3.N PROOF OF THEOREM 3.32
- APPENDIX 3.O PROOF OF THEOREM 3.33
- Chapter 4 Background: Probability and Random Variable Theory
- 4.1 INTRODUCTION
- 4.2 BASIC CONCEPTS: EXPERIMENTS-PROBABILITY THEORY
- 4.2.1 Experiments and Sample Spaces
- 4.2.2 Events
- 4.2.3 Probability of an Event
- 4.2.4 Conditional Probability
- 4.2.5 Independent Events
- 4.2.6 Countable and Uncountable Sample Spaces
- 4.3 THE RANDOM VARIABLE
- 4.3.1 Notes
- 4.3.2 Sample Spaces for Random Variables
- 4.3.3 Random Variable Based on Experimental Outcomes
- 4.4 DISCRETE AND CONTINUOUS RANDOM VARIABLES
- 4.4.1 Discrete Random Variables
- 4.4.2 Continuous Random Variables
- 4.4.3 Cumulative Distribution Function
- 4.5 STANDARD RANDOM VARIABLES
- 4.6 FUNCTIONS OF A RANDOM VARIABLE
- 4.7 EXPECTATION
- 4.7.1 Mean, Variance, and Moments of a Random Variable
- 4.7.2 Expectation of a Function of a Random Variable
- 4.7.3 Characteristic Function
- 4.8 GENERATION OF DATA CONSISTENT WITH DEFINED PDF
- 4.8.1 Example
- 4.9 VECTOR RANDOM VARIABLES
- 4.9.1 Random Variable Defined Based on Experimental Outcomes
- 4.9.2 Vector Random Variables
- 4.9.3 Sample Space for Vector Random Variable
- 4.10 PAIRS OF RANDOM VARIABLES
- 4.10.1 Notation
- 4.10.2 Joint Cumulative Distribution Function (Joint CDF)
- 4.10.3 Joint Probability Mass Function
- 4.10.4 Marginal Probability Mass Function
- 4.10.5 Joint Probability Density Function
- 4.10.6 Marginal Distribution and Density Functions
- 4.10.7 Linearity of Expectation Operator
- 4.10.8 Conditional Mass and Density Functions
- 4.11 COVARIANCE AND CORRELATION
- 4.11.1 Understanding Covariance
- 4.11.2 Uncorrelatedness
- 4.11.3 The Correlation Coefficient
- 4.12 SUMS OF RANDOM VARIABLES
- 4.12.1 Sum of Gaussian Random Variables
- 4.12.2 Difficulty in Determining the PDF of a Sum of Random Variables
- 4.13 JOINTLY GAUSSIAN RANDOM VARIABLES
- 4.14 STIRLING'S FORMULA AND APPROXIMATIONS TO BINOMIAL
- 4.14.1 Binomial Probability Mass Function
- 4.14.2 Stirling's Formula
- 4.14.3 DeMoivre-Laplace Theorem
- 4.14.4 Poisson Approximation to Binomial
- 4.15 PROBLEMS
- APPENDIX 4.A PROOF OF THEOREM 4.6
- APPENDIX 4.B PROOF OF THEOREM 4.8
- APPENDIX 4.C PROOF OF THEOREM 4.9
- APPENDIX 4.D PROOF OF THEOREM 4.21
- APPENDIX 4.E PROOF OF STIRLING'S FORMULA
- APPENDIX 4.F PROOF OF THEOREM 4.27
- 4.F.1 Relative Error
- APPENDIX 4.G PROOF OF THEOREM 4.29
- Chapter 5 Introduction to Random Processes
- 5.1 RANDOM PROCESSES
- 5.2 DEFINITION OF A RANDOM PROCESS
- 5.2.1 Notation
- 5.3 EXAMPLES OF RANDOM PROCESSES
- 5.3.1 On-Off Sinusoid
- 5.3.2 Sinusoid with Random Phase
- 5.3.3 Sinusoid with Random Amplitude
- 5.3.4 Amplitude and Frequency Modulation
- 5.3.5 Binary Digital Communication Signalling
- 5.4 EXPERIMENTS AND EXPERIMENTAL OUTCOMES
- 5.4.1 Experiments and Subexperiments
- 5.4.2 Specifying Sample Spaces: Single-Vector-Matrix Cases
- 5.5 PROTOTYPICAL EXPERIMENTS
- 5.5.1 Bernoulli Experiment
- 5.5.2 Poisson Experiment
- 5.5.3 Experiments with an Infinite Number of Subexperiments
- 5.6 RANDOM VARIABLES DEFINED BY A RANDOM PROCESS
- 5.7 CLASSIFICATION OF RANDOM PROCESSES
- 5.7.1 One-Dimensional Random Processes
- 5.7.2 Two-Dimensional Random Processes
- 5.7.3 Higher-Dimensional Random Processes
- 5.8 CLASSIFICATION: ONE-DIMENSIONAL RPs
- 5.8.1 Classification According to State
- 5.9 SUMS OF RANDOM PROCESSES
- 5.10 PROBLEMS
- Chapter 6 Prototypical Random Processes
- 6.1 INTRODUCTION
- 6.2 BERNOULLI RANDOM PROCESSES
- 6.2.1 Generalized Bernoulli Random Processes
- 6.2.2 Random Walk
- 6.3 POISSON RANDOM PROCESSES
- 6.3.1 Poisson Point Process
- 6.3.2 Poisson Counting Process
- 6.3.3 Shot Noise Random Process
- 6.3.4 Generalized Shot Noise
- 6.3.5 Causal Random Telegraph Signal
- 6.3.6 Random Telegraph Signal: Type II
- 6.4 CLUSTERED RANDOM PROCESSES
- 6.4.1 Experiment for a Clustered Random Process
- 6.4.2 Clustered Point Random Process
- 6.4.3 Clustered Random Process
- 6.5 SIGNALLING RANDOM PROCESSES
- 6.5.1 Signalling Random Processes
- 6.5.2 Generalized Signalling Random Process
- 6.6 JITTER
- 6.6.1 Jittered Pulse Train
- 6.7 WHITE NOISE
- 6.7.1 White Noise: Approach I
- 6.7.2 White Noise: Approach II
- 6.7.3 Gaussian White Noise
- 6.7.4 Filtered White Noise
- 6.8 1/f NOISE
- 6.8.1 Model for 1/f Noise
- 6.8.2 Example
- 6.9 BIRTH-DEATH RANDOM PROCESSES
- 6.9.1 Experiment Underpinning Birth-Death Random Process
- 6.9.2 Birth-Death Random Processes
- 6.10 ORTHOGONAL INCREMENT RANDOM PROCESSES
- 6.10.1 Increment Function
- 6.10.2 Definition: Orthogonal Increment Random Process
- 6.10.3 Examples of Orthogonal Increment Random Processes
- 6.11 LINEAR FILTERING OF RANDOM PROCESSES
- 6.12 SUMMARY OF RANDOM PROCESSES
- 6.13 PROBLEMS
- APPENDIX 6.A PROOF OF THEOREM 6.4
- Chapter 7 Characterizing Random Processes
- 7.1 INTRODUCTION
- 7.1.1 Notation for One-Dimensional Random Processes
- 7.1.2 Associated Random Processes
- 7.2 TIME EVOLUTION OF PMF OR PDF
- 7.3 FIRST-, SECOND-, AND HIGHER-ORDER CHARACTERIZATION
- 7.3.1 First-Order Characterization
- 7.3.2 Second-Order Characterization
- 7.3.3 Nth-Order Characterization
- 7.3.4 Mean and Variance and Average Power
- 7.3.5 Transient, Steady-State, Periodic, and Aperiodic Random Processes
- 7.4 AUTOCORRELATION AND POWER SPECTRAL DENSITY
- 7.4.1 Definitions for Individual Signals
- 7.4.2 Definitions for Autocorrelation and PSD
- 7.4.3 Simplified Notation: Countable Signal Sample Space Case
- 7.4.4 Notation: Vector Case
- 7.4.5 Infinite Interval Case
- 7.4.6 Existence: Finite Interval