Analyzing Neural Time Series Data: Theory and Practice
"This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency- and synchronization-based analyses of magnetoencephalography (MEG),...
Classification: Electronic book
Main author: Cohen, Mike X.
Format: Electronic eBook
Language: English
Published: Cambridge, Massachusetts: The MIT Press, [2014]
Series: Issues in clinical and cognitive neuropsychology
Online access: Full text
Table of Contents:
- pt. I Introduction
- 1. The Purpose of This Book, Who Should Read It, and How to Use It
- 1.1. What Is Cognitive Electrophysiology?
- 1.2. What Is the Purpose of This Book?
- 1.3. Why Shouldn't You Use <Insert Name of M/EEG Software Analysis Package>?
- 1.4. Why Program Analyses, and Why in Matlab?
- 1.5. How Best to Learn from and Use This Book
- 1.6. Sample Data and Online Code
- 1.7. Terminology Used in This Book
- 1.8. Exercises
- 1.9. Is Everything There Is to Know about EEG Analyses in This Book?
- 1.10. Who Should Read This Book?
- 1.11. Is This Book Difficult?
- 1.12. Questions?
- 2. Advantages and Limitations of Time- and Time-Frequency-Domain Analyses
- 2.1. Why EEG?
- 2.2. Why Not EEG?
- 2.3. Interpreting Voltage Values from the EEG Signal
- 2.4. Advantages of Event-Related Potentials
- 2.5. Limitations of ERPs
- 2.6. Advantages of Time-Frequency-Based Approaches
- 2.7. Limitations of Time-Frequency-Based Approaches
- 2.8. Temporal Resolution, Precision, and Accuracy of EEG
- 2.9. Spatial Resolution, Precision, and Accuracy of EEG
- 2.10. Topographical Localization versus Brain Localization
- 2.11. EEG or MEG?
- 2.12. Costs of EEG Research
- 3. Interpreting and Asking Questions about Time-Frequency Results
- 3.1. EEG Time-Frequency: The Basics
- 3.2. Ways to View Time-Frequency Results
- 3.3. tfviewerx and erpviewerx
- 3.4. How to View and Interpret Time-Frequency Results
- 3.5. Things to Be Suspicious of When Viewing Time-Frequency Results
- 3.6. Do Results in Time-Frequency Plots Mean That There Were Neural Oscillations?
- 4. Introduction to Matlab Programming
- 4.1. Write Clean and Efficient Code
- 4.2. Use Meaningful File and Variable Names
- 4.3. Make Regular Backups of Your Code and Keep Original Copies of Modified Code
- 4.4. Initialize Variables
- 4.5. Help!
- 4.6. Be Patient and Embrace the Learning Experience
- 4.7. Exercises
- 5. Introduction to the Physiological Bases of EEG
- 5.1. Biophysical Events That Are Measurable with EEG
- 5.2. Neurobiological Mechanisms of Oscillations
- 5.3. Phase-Locked, Time-Locked, Task-Related
- 5.4. Neurophysiological Mechanisms of ERPs
- 5.5. Are Electrical Fields Causally Involved in Cognition?
- 5.6. What if Electrical Fields Are Not Causally Involved in Cognition?
- 6. Practicalities of EEG Measurement and Experiment Design
- 6.1. Designing Experiments: Discuss, Pilot, Discuss, Pilot
- 6.2. Event Markers
- 6.3. Intra- and Intertrial Timing
- 6.4. How Many Trials You Will Need
- 6.5. How Many Electrodes You Will Need
- 6.6. Which Sampling Rate to Use When Recording Data
- 6.7. Other Optional Equipment to Consider
- pt. II Preprocessing and Time-Domain Analyses
- 7. Preprocessing Steps Necessary and Useful for Advanced Data Analysis
- 7.1. What Is Preprocessing?
- 7.2. The Balance between Signal and Noise
- 7.3. Creating Epochs
- 7.4. Matching Trial Count across Conditions
- 7.5. Filtering
- 7.6. Trial Rejection
- 7.7. Spatial Filtering
- 7.8. Referencing
- 7.9. Interpolating Bad Electrodes
- 7.10. Start with Clean Data
- 8. EEG Artifacts: Their Detection, Influence, and Removal
- 8.1. Removing Data Based on Independent Components Analysis
- 8.2. Removing Trials because of Blinks
- 8.3. Removing Trials because of Oculomotor Activity
- 8.4. Removing Trials Based on EMG in EEG Channels
- 8.5. Removing Trials Based on Task Performance
- 8.6. Removing Trials Based on Response Hand EMG
- 8.7. Train Subjects to Minimize Artifacts
- 8.8. Minimize Artifacts during Data Collection
- 9. Overview of Time-Domain EEG Analyses
- 9.1. Event-Related Potentials
- 9.2. Filtering ERPs
- 9.3. Butterfly Plots and Global Field Power/Topographical Variance Plots
- 9.4. The Flicker Effect
- 9.5. Topographical Maps
- 9.6. Microstates
- 9.7. ERP Images
- 9.8. Exercises
- pt. III Frequency and Time-Frequency Domains Analyses
- 10. The Dot Product and Convolution
- 10.1. Dot Product
- 10.2. Convolution
- 10.3. How Does Convolution Work?
- 10.4. Convolution versus Cross-Covariance
- 10.5. The Purpose of Convolution for EEG Data Analyses
- 10.6. Exercises
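Chapter 10 treats convolution as a sequence of dot products between a kernel and overlapping signal segments. As a minimal illustration (not the book's sample code; the signal and kernel below are made up), a hand-rolled "flip-and-slide" convolution can be checked against Matlab's built-in conv:

```matlab
% Minimal convolution sketch: hypothetical signal and kernel, not the book's data.
signal = randn(1,100);                % hypothetical time series
kernel = [0.2 0.5 1 0.5 0.2];         % hypothetical smoothing kernel
kernel = kernel / sum(kernel);        % normalize so the output stays in signal units

n_conv  = length(signal) + length(kernel) - 1;   % length of the full convolution
flipped = kernel(end:-1:1);                      % convolution flips the kernel
padded  = [zeros(1,length(kernel)-1) signal zeros(1,length(kernel)-1)];
manual  = zeros(1,n_conv);
for ti = 1:n_conv
    % dot product between the flipped kernel and the signal segment beneath it
    manual(ti) = sum( flipped .* padded(ti:ti+length(kernel)-1) );
end

builtin_result = conv(signal, kernel);   % Matlab's built-in convolution
max(abs(manual - builtin_result))        % ~0, up to numerical precision
```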
- 11. The Discrete Time Fourier Transform, the FFT, and the Convolution Theorem
- 11.1. Making Waves
- 11.2. Finding Waves in EEG Data with the Fourier Transform
- 11.3. The Discrete Time Fourier Transform
- 11.4. Visualizing the Results of a Fourier Transform
- 11.5. Complex Results and Negative Frequencies
- 11.6. Inverse Fourier Transform
- 11.7. The Fast Fourier Transform
- 11.8. Stationarity and the Fourier Transform
- 11.9. Extracting More or Fewer Frequencies than Data Points
- 11.10. The Convolution Theorem
- 11.11. Tips for Performing FFT-Based Convolution in Matlab
- 11.12. Exercises
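Chapter 11 culminates in the convolution theorem: convolution in the time domain equals multiplication in the frequency domain. A hedged sketch of that equivalence, using an arbitrary noisy signal and Gaussian kernel (all parameters are illustrative only):

```matlab
% Convolution theorem sketch (illustrative parameters, not the book's examples).
srate  = 500;                          % sampling rate in Hz (assumed)
t      = 0:1/srate:2;                  % 2 s of time
signal = sin(2*pi*6*t) + 0.5*sin(2*pi*13*t) + randn(size(t));   % 6 Hz + 13 Hz + noise
kernel = exp( -((-0.5:1/srate:0.5).^2) / (2*0.05^2) );          % Gaussian, sd = 50 ms

% convolution computed in the time domain
time_conv = conv(signal, kernel, 'same');

% the same convolution computed in the frequency domain
n_conv    = length(signal) + length(kernel) - 1;
freq_conv = ifft( fft(signal,n_conv) .* fft(kernel,n_conv) );
half_k    = floor(length(kernel)/2);
freq_conv = freq_conv(half_k+1 : half_k+length(signal));   % trim to match conv(...,'same')

max(abs(time_conv - freq_conv))        % ~0, up to numerical precision
```

FFT-based convolution of this kind is also how chapters 12 and 13 implement wavelet convolution efficiently.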
- 12. Morlet Wavelets and Wavelet Convolution
- 12.1. Why Wavelets?
- 12.2. How to Make Wavelets
- 12.3. Wavelet Convolution as a Bandpass Filter
- 12.4. Limitations of Wavelet Convolution as Discussed Thus Far
- 12.5. Exercises
- 13. Complex Morlet Wavelets and Extracting Power and Phase
- 13.1. The Wavelet Complex
- 13.2. Imagining the Imaginary
- 13.3. Rectangular and Polar Notation and the Complex Plane
- 13.4. Euler's Formula
- 13.5. Euler's Formula and the Result of Complex Wavelet Convolution
- 13.6. From Time Point to Time Series
- 13.7. Parameters of Wavelets and Recommended Settings
- 13.8. Determining the Frequency Smoothing of Wavelets
- 13.9. Tips for Writing Efficient Convolution Code in Matlab
- 13.10. Describing This Analysis in Your Methods Section
- 13.11. Exercises
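Chapters 12 and 13 extract frequency-band-specific power and phase by convolving the data with a complex Morlet wavelet. The sketch below uses simulated data and assumed parameters (10 Hz, 6 cycles); it is not the book's sample EEG or its online code:

```matlab
% Complex Morlet wavelet convolution sketch (simulated data, assumed parameters).
srate  = 500;                          % Hz (assumed)
wtime  = -1:1/srate:1;                 % wavelet time axis
freq   = 10;                           % wavelet peak frequency in Hz (assumed)
cycles = 6;                            % Gaussian width in cycles (assumed)
s      = cycles / (2*pi*freq);         % Gaussian standard deviation in seconds

% complex Morlet wavelet = complex sine wave tapered by a Gaussian
wavelet = exp(1i*2*pi*freq*wtime) .* exp(-wtime.^2 / (2*s^2));

% simulated one-channel "EEG": a 10 Hz burst embedded in noise
data_t = 0:1/srate:3;
data   = randn(size(data_t));
data(500:1000) = data(500:1000) + sin(2*pi*freq*data_t(500:1000));

% FFT-based convolution (the convolution theorem from chapter 11)
n_conv = length(data) + length(wavelet) - 1;
as     = ifft( fft(data,n_conv) .* fft(wavelet,n_conv) );   % "analytic signal"
half_w = floor(length(wavelet)/2);
as     = as(half_w+1 : half_w+length(data));                % trim the wings

pow10 = abs(as).^2;   % instantaneous power at ~10 Hz (arbitrary units; no normalization here)
phs10 = angle(as);    % instantaneous phase at ~10 Hz
plot(data_t, pow10)
```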
- 14. Bandpass Filtering and the Hilbert Transform
- 14.1. Hilbert Transform
- 14.2. Filtering Data before Applying the Hilbert Transform
- 14.3. Finite versus Infinite Impulse Response Filters
- 14.4. Bandpass, Band-Stop, High-Pass, Low-Pass
- 14.5. Constructing a Filter
- 14.6. Check Your Filters
- 14.7. Applying the Filter to Data
- 14.8. Butterworth (IIR) Filter
- 14.9. Filtering Each Trial versus Filtering Concatenated Trials
- 14.10. Multiple Frequencies
- 14.11. A World of Filters
- 14.12. Describing This Analysis in Your Methods Section
- 14.13. Exercises
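Chapter 14 reaches the same power and phase estimates by bandpass filtering and then applying the Hilbert transform. Filter construction is omitted in the sketch below; it only shows the FFT-based analytic-signal step, assuming the input is already narrowband (the Signal Processing Toolbox's hilbert function computes the same thing):

```matlab
% Hilbert transform via the FFT (assumes the input has already been bandpass filtered).
srate      = 500;
t          = 0:1/srate:2;
narrowband = sin(2*pi*10*t);           % stand-in for a bandpass-filtered signal

n = length(narrowband);
X = fft(narrowband);

% zero the negative frequencies and double the positive ones
h = zeros(1,n);
h(1) = 1;                              % DC
if mod(n,2) == 0
    h(n/2+1)     = 1;                  % Nyquist bin (even-length case)
    h(2:n/2)     = 2;
else
    h(2:(n+1)/2) = 2;
end

analytic   = ifft(X .* h);             % complex analytic signal
inst_power = abs(analytic).^2;         % instantaneous power
inst_phase = angle(analytic);          % instantaneous phase
```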
- 15. Short-Time FFT
- 15.1. How the Short-Time FFT Works
- 15.2. Taper the Time Series
- 15.3. Time Segment Lengths and Overlap
- 15.4. Power and Phase
- 15.5. Describing This Analysis in Your Methods Section
- 15.6. Exercises
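Chapter 15's short-time FFT tapers successive, overlapping segments and takes the FFT of each. A rough sketch with assumed segment length (400 ms) and overlap (75%), a hand-built Hann taper, and one-sided power scaling constants omitted:

```matlab
% Short-time FFT sketch (assumed window length and overlap; scaling constants omitted).
srate   = 500;
data    = randn(1, 3*srate);           % 3 s of fake single-channel data
winlen  = 200;                         % 400 ms segments (assumed)
step    = 50;                          % 75% overlap (assumed)
hannwin = 0.5 - 0.5*cos( 2*pi*(0:winlen-1)/(winlen-1) );   % Hann taper

starts = 1:step:length(data)-winlen+1;
hz     = linspace(0, srate/2, floor(winlen/2)+1);
tf     = zeros(numel(hz), numel(starts));
for wi = 1:numel(starts)
    seg      = data(starts(wi):starts(wi)+winlen-1) .* hannwin;   % taper the segment
    segfft   = fft(seg) / winlen;
    tf(:,wi) = ( abs(segfft(1:numel(hz))).^2 ).';   % power spectrum of this segment
end
% tf is now a frequencies x time-segments power matrix
```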
- 16. Multitapers
- 16.1. How the Multitaper Method Works
- 16.2. The Tapers
- 16.3. When You Should and Should Not Use Multitapers
- 16.4. The Multitaper Framework and Advanced Topics
- 16.5. Describing This Analysis in Your Methods Section
- 16.6. Exercises
- 17. Less Commonly Used Time-Frequency Decomposition Methods
- 17.1. Autoregressive Modeling
- 17.2. Hilbert-Huang (Empirical Mode Decomposition)
- 17.3. Matching Pursuit
- 17.4. P-Episode
- 17.5. S-Transform
- 18. Time-Frequency Power and Baseline Normalizations
- 18.1. 1/f Power Scaling
- 18.2. The Solution to 1/f Power in Task Designs
- 18.3. Decibel Conversion
- 18.4. Percentage Change and Baseline Division
- 18.5. Z-Transform
- 18.6. Not All Transforms Are Equal
- 18.7. Other Transforms
- 18.8. Mean versus Median
- 18.9. Single-Trial Baseline Normalization
- 18.10. The Choice of Baseline Time Window
- 18.11. Disadvantages of Baseline-Normalized Power
- 18.12. Signal-to-Noise Estimates
- 18.13. Number of Trials and Power Estimates
- 18.14. Downsampling Results after Analyses
- 18.15. Describing This Analysis in Your Methods Section
- 18.16. Exercises
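Chapter 18 argues that raw power is dominated by 1/f scaling and should be baseline normalized; the decibel conversion divides power by the mean baseline power per frequency and takes 10*log10. A sketch with a hypothetical power matrix and an assumed -400 to -100 ms baseline window:

```matlab
% Baseline normalization sketch (hypothetical power matrix and assumed baseline window).
times   = -500:2:1498;                      % ms, assumed epoch
tfpower = rand(30, length(times)) + 1;      % fake frequencies x time raw-power matrix

baseidx  = times >= -400 & times <= -100;   % assumed baseline window
baseline = mean(tfpower(:,baseidx), 2);     % mean baseline power per frequency

% decibel change from baseline
db_power = 10*log10( bsxfun(@rdivide, tfpower, baseline) );

% percentage change from baseline (an alternative the chapter also compares)
pct_power = 100 * bsxfun(@rdivide, bsxfun(@minus, tfpower, baseline), baseline);
```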
- 19. Intertrial Phase Clustering
- 19.1. Why Phase Values Cannot Be Averaged
- 19.2. Intertrial Phase Clustering
- 19.3. Strength in Numbers
- 19.4. Using ITPC When There Are Few Trials or Condition Differences in Trial Count
- 19.5. Effects of Temporal Jitter on ITPC and Power
- 19.6. ITPC and Power
- 19.7. Weighted ITPC
- 19.8. Multimodal Phase Distributions
- 19.9. Spike-Field Coherence
- 19.10. Describing This Analysis in Your Methods Section
- 19.11. Exercises
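Chapter 19's intertrial phase clustering (ITPC) maps each trial's phase angle to a unit vector in the complex plane and measures the length of the average vector. A minimal sketch with random phases standing in for single-trial wavelet phases:

```matlab
% ITPC sketch (random phases stand in for single-trial wavelet or Hilbert phases).
n_timepoints = 600;
n_trials     = 100;
phases = 2*pi*rand(n_timepoints, n_trials) - pi;   % time points x trials, radians

% ITPC at each time point: length of the trial-average unit phase vector
itpc = abs( mean( exp(1i*phases), 2 ) );

% ~0 means phases are uniformly distributed across trials; ~1 means tight clustering
plot(itpc)
```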
- 20. Differences among Total, Phase-Locked, and Non-Phase-Locked Power and Intertrial Phase Consistency
- 20.1. Total Power
- 20.2. Non-Phase-Locked Power
- 20.3. Phase-Locked Power
- 20.4. ERP Time-Frequency Power
- 20.5. Intertrial Phase Clustering
- 20.6. When to Use What Approach
- 20.7. Exercise
- 21. Interpretations and Limitations of Time-Frequency Power and ITPC Analyses
- 21.1. Terminology
- 21.2. When to Use What Time-Frequency Decomposition Method
- 21.3. Interpreting Time-Frequency Power
- 21.4. Interpreting Time-Frequency Intertrial Phase Clustering
- 21.5. Limitations of Time-Frequency Power and Intertrial Phase Clustering
- 21.6. Do Time-Frequency Analyses Reveal Neural Oscillations?
- pt. IV Spatial Filters
- 22. Surface Laplacian
- 22.1. What Is the Surface Laplacian?
- 22.2. Algorithms for Computing the Surface Laplacian for EEG Data
- 22.3. Surface Laplacian for Topographical Localization
- 22.4. Surface Laplacian for Connectivity Analyses
- 22.5. Surface Laplacian for Cleaning Topographical Noise
- 22.6. Describing This Analysis in Your Methods Section
- 22.7. Exercises
- 23. Principal Components Analysis
- 23.1. Purpose and Interpretations of Principal Components Analysis
- 23.2. How PCA Is Computed
- 23.3. Distinguishing Significant from Nonsignificant Components
- 23.4. Rotating PCA Solutions
- 23.5. Time-Resolved PCA
- 23.6. PCA with Time-Frequency Information
- 23.7. PCA across Conditions
- 23.8. Independent Components Analysis
- 23.9. Describing This Method in Your Methods Section
- 23.10. Exercises
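Chapter 23 computes principal components from the channel covariance matrix; the eigenvectors are spatial weights and the eigenvalues give each component's variance. A bare sketch on simulated channels-by-time data (mean-centered, unrotated, no significance testing):

```matlab
% PCA sketch on a channels x time matrix (simulated data; no rotation or component selection).
n_chan = 64;  n_pnts = 1000;
data   = randn(n_chan, n_pnts);
data   = bsxfun(@minus, data, mean(data,2));       % remove each channel's mean

covmat = (data*data') / (n_pnts-1);                % channel covariance matrix
[eigvecs, eigvals] = eig(covmat);
[eigvals, sortidx] = sort(diag(eigvals), 'descend');   % order components by variance
eigvecs = eigvecs(:, sortidx);

pct_var = 100 * eigvals / sum(eigvals);            % percent variance per component
comp_ts = eigvecs' * data;                         % component time series
```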
- 24. Basics of Single-Dipole and Distributed-Source Imaging
- 24.1. The Forward Solution
- 24.2. The Inverse Problem
- 24.3. Dipole Fitting
- 24.4. Nonadaptive Distributed-Source Imaging Methods
- 24.5. Adaptive Distributed-Source Imaging
- 24.6. Theoretical and Practical Limits of Spatial Precision and Resolution
- pt. V Connectivity
- 25. Introduction to the Various Connectivity Analyses
- 25.1. Why Only Two Sites (Bivariate Connectivity)?
- 25.2. Important Concepts Related to Bivariate Connectivity
- 25.3. Which Measure of Connectivity Should Be Used?
- 25.4. Phase-Based Connectivity
- 25.5. Power-Based Connectivity
- 25.6. Granger Prediction
- 25.7. Mutual Information
- 25.8. Cross-Frequency Coupling
- 25.9. Graph Theory
- 25.10. Potential Confound of Volume Conduction
- 26. Phase-Based Connectivity
- 26.1. Terminology
- 26.2. ISPC over Time
- 26.3. ISPC-Trials
- 26.4. ISPC and the Number of Trials
- 26.5. Relation between ISPC and Power
- 26.6. Weighted ISPC-Trials
- 26.7. Spectral Coherence (Magnitude-Squared Coherence)
- 26.8. Phase Lag-Based Measures
- 26.9. Which Measure of Phase Connectivity Should You Use?
- 26.10. Testing the Mean Phase Angle
- 26.11. Describing These Analyses in Your Methods Section
- 26.12. Exercises
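Chapter 26 extends phase clustering to connectivity: intersite phase clustering (ISPC) measures the consistency of phase-angle differences between two electrodes, computed over trials or over time. A sketch with simulated phases from two hypothetical channels:

```matlab
% ISPC-trials sketch (simulated phase data from two hypothetical channels).
n_timepoints = 600;  n_trials = 100;
phases_chan1 = 2*pi*rand(n_timepoints, n_trials) - pi;   % fake phase time series
phases_chan2 = 2*pi*rand(n_timepoints, n_trials) - pi;

% ISPC over trials: clustering of phase-angle differences at each time point
phase_diffs = phases_chan1 - phases_chan2;
ispc_trials = abs( mean( exp(1i*phase_diffs), 2 ) );

% ISPC over time would instead average exp(1i*phase_diffs) within a sliding
% window on single trials; the chapter weighs the trade-offs of the two options.
```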
- 27. Power-Based Connectivity
- 27.1. Spearman versus Pearson Coefficient for Power Correlations
- 27.2. Power Correlations over Time
- 27.3. Power Correlations over Trials
- 27.4. Partial Correlations
- 27.5. Matlab Programming Tips
- 27.6. Describing This Analysis in Your Methods Section
- 27.7. Exercises
- 28. Granger Prediction
- 28.1. Univariate Autoregression
- 28.2. Bivariate Autoregression
- 28.3. Autoregression Errors and Error Variances
- 28.4. Granger Prediction over Time
- 28.5. Model Order
- 28.6. Frequency Domain Granger Prediction
- 28.7. Time Series Covariance Stationarity
- 28.8. Baseline Normalization of Granger Prediction Results
- 28.9. Statistics
- 28.10. Additional Applications of Granger Prediction
- 28.11. Exercises
- 29. Mutual Information
- 29.1. Entropy
- 29.2. How Many Histogram Bins to Use
- 29.3. Enjoy the Entropy
- 29.4. Joint Entropy
- 29.5. Mutual Information
- 29.6. Mutual Information and Amount of Data
- 29.7. Mutual Information with Noisy Data
- 29.8. Mutual Information over Time or over Trials
- 29.9. Mutual Information on Real Data
- 29.10. Mutual Information on Frequency-Band-Specific Data
- 29.11. Lagged Mutual Information
- 29.12. Statistics
- 29.13. More Information
- 29.14. Describing This Analysis in Your Methods Section
- 29.15. Exercises
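Chapter 29 builds mutual information from histogram-based entropies, MI(X,Y) = H(X) + H(Y) - H(X,Y). A sketch with two hypothetical continuous variables and an assumed bin count (the chapter covers principled bin-count rules):

```matlab
% Mutual information sketch via histogram entropies (hypothetical data, assumed 25 bins).
n = 1000;
x = randn(n,1);
y = 0.6*x + 0.4*randn(n,1);            % y shares information with x
nbins = 25;                            % assumed; see the chapter for bin-count rules

% map each value to a bin index between 1 and nbins
binidx = @(v) min( floor( (v-min(v)) ./ (max(v)-min(v)) * nbins ) + 1, nbins );
bx = binidx(x);
by = binidx(y);

% marginal and joint probability distributions
px  = accumarray(bx, 1, [nbins 1]) / n;
py  = accumarray(by, 1, [nbins 1]) / n;
pxy = accumarray([bx by], 1, [nbins nbins]) / n;

% entropies in bits (zero-probability bins are excluded)
H  = @(p) -sum( p(p>0) .* log2(p(p>0)) );
mi = H(px) + H(py) - H(pxy(:));        % mutual information in bits
```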
- 30. Cross-Frequency Coupling
- 30.1. Visual Inspection of Cross-Frequency Coupling
- 30.2. Power-Power Correlations
- 30.3. A Priori Phase-Amplitude Coupling
- 30.4. Separating Task-Related Phase and Power Coactivations from Phase-Amplitude Coupling
- 30.5. Mixed A Priori/Exploratory Phase-Amplitude Coupling
- 30.6. Exploratory Phase-Amplitude Coupling
- 30.7. Notes about Phase-Amplitude Coupling
- 30.8. Phase-Phase Coupling
- 30.9. Other Methods for Quantifying Cross-Frequency Coupling
- 30.10. Cross-Frequency Coupling over Time or over Trials
- 30.11. Describing This Analysis in Your Methods Section
- 30.12. Exercises
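Chapter 30 covers several ways to quantify phase-amplitude coupling. One widely used option it discusses is the length of the amplitude-weighted mean phase vector, tested against a null distribution built by time-shifting the amplitude series. A sketch with simulated low-frequency phase and a coupled amplitude envelope (all parameters assumed):

```matlab
% Phase-amplitude coupling sketch (simulated data; one of several measures the chapter covers).
srate = 500;  t = 0:1/srate:5;
lowfreq_phase = angle( exp(1i*2*pi*5*t) );                  % 5 Hz phase, as if from a wavelet
highfreq_amp  = 1 + 0.8*cos(lowfreq_phase);                 % amplitude envelope locked to that phase
highfreq_amp  = highfreq_amp + 0.2*randn(size(highfreq_amp));

% observed coupling: amplitude-weighted mean phase-vector length
pac_obs = abs( mean( highfreq_amp .* exp(1i*lowfreq_phase) ) );

% null distribution: cut and time-shift the amplitude series at random points
n_perm   = 1000;
pac_null = zeros(1, n_perm);
for permi = 1:n_perm
    cutpoint        = randi([10 length(t)-10]);
    shifted_amp     = highfreq_amp([cutpoint:end 1:cutpoint-1]);
    pac_null(permi) = abs( mean( shifted_amp .* exp(1i*lowfreq_phase) ) );
end
pac_z = (pac_obs - mean(pac_null)) / std(pac_null);         % coupling strength as a z-score
```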
- 31. Graph Theory
- 31.1. Networks as Matrices and Graphs
- 31.2. Thresholding Connectivity Matrices
- 31.3. Connectivity Degree
- 31.4. Clustering Coefficient
- 31.5. Path Length
- 31.6. Small-World Networks
- 31.7. Statistics
- 31.8. How to Describe These Analyses in Your Paper
- 31.9. Exercises
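Chapter 31 turns all-to-all connectivity matrices into graphs and summarizes them with measures such as degree and the clustering coefficient. A sketch on a fake symmetric connectivity matrix, using an assumed median-plus-one-standard-deviation threshold:

```matlab
% Graph theory sketch: degree and clustering coefficient from a thresholded connectivity matrix.
n_elec  = 64;
connmat = rand(n_elec);
connmat = (connmat + connmat') / 2;               % fake symmetric connectivity values
connmat(1:n_elec+1:end) = 0;                      % zero the diagonal

thresh = median(connmat(connmat>0)) + std(connmat(connmat>0));   % assumed threshold rule
adjmat = connmat > thresh;                                       % binarized adjacency matrix

degree = sum(adjmat, 2);                          % connectivity degree of each electrode

% clustering coefficient: fraction of each node's neighbors that connect to each other
clustcoef = zeros(n_elec,1);
for i = 1:n_elec
    neighbors = find(adjmat(i,:));
    k = numel(neighbors);
    if k > 1
        sub          = adjmat(neighbors, neighbors);
        clustcoef(i) = sum(sub(:)) / (k*(k-1));   % existing / possible neighbor-to-neighbor links
    end
end
```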
- pt. VI Statistical Analyses
- 32. Advantages and Limitations of Different Statistical Procedures
- 32.1. Are Statistics Necessary?
- 32.2. At What Level Should Statistics Be Performed?
- 32.3. What p-Value Should Be Used, and Should Multiple-Comparisons Corrections Be Applied?
- 32.4. Are p-Values the Only Statistical Metric?
- 32.5. Statistical Significance versus Practical Significance
- 32.6. Type I and Type II Errors
- 32.7. What Kinds of Statistics Should Be Applied?
- 32.8. How to Combine Data across Subjects
- 33. Nonparametric Permutation Testing
- 33.1. Advantages of Nonparametric Permutation Testing
- 33.2. Creating a Null-Hypothesis Distribution
- 33.3. How Many Iterations Are Necessary for the Null-Hypothesis Distribution?
- 33.4. Determining Statistical Significance
- 33.5. Multiple Comparisons and Their Corrections
- 33.6. Correction for Multiple Comparisons Using Pixel-Based Statistics
- 33.7. Corrections for Multiple Comparisons Using Cluster-Based Statistics
- 33.8. False Discovery Rate for Multiple-Comparisons Correction
- 33.9. What Should Be Permuted?
- 33.10. Nonparametric Permutation Testing beyond Simple Bivariate Cases
- 33.11. Describing This Analysis in Your Methods Section
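Chapter 33 replaces analytic null distributions with ones built by shuffling condition labels and recomputing the test statistic many times. A bare-bones sketch for a single value (the chapter extends this to full time-frequency maps with pixel- and cluster-based multiple-comparisons corrections):

```matlab
% Nonparametric permutation test sketch for one value (hypothetical single-trial data).
n_trials = 100;
condA = randn(n_trials,1) + 0.3;       % simulated values, condition A
condB = randn(n_trials,1);             % simulated values, condition B
obs_diff = mean(condA) - mean(condB);  % observed condition difference

alldata    = [condA; condB];
n_perm     = 1000;
null_diffs = zeros(n_perm,1);
for permi = 1:n_perm
    shuffled          = alldata(randperm(2*n_trials));   % shuffle the condition labels
    null_diffs(permi) = mean(shuffled(1:n_trials)) - mean(shuffled(n_trials+1:end));
end

pval = mean( abs(null_diffs) >= abs(obs_diff) );          % two-tailed permutation p-value
zval = (obs_diff - mean(null_diffs)) / std(null_diffs);   % or a z-value against the null
```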
- 34. Within-Subject Statistical Analyses
- 34.1. Changes in Task-Related Power Compared to Baseline
- 34.2. Discrete Condition Differences in Power
- 34.3. Continuous Relationship with Power: Single-Trial Correlations
- 34.4. Continuous Relationships with Power: Single-Trial Multiple Regression
- 34.5. Determining Statistical Significance of Phase-Based Data
- 34.6. Testing Preferred Phase Angle across Conditions
- 34.7. Testing the Statistical Significance of Correlation Coefficients
- 35. Group-Level Analyses
- 35.1. Avoid Circular Inferences
- 35.2. Group-Level Analysis Strategy 1: Test Each Pixel and Apply a Mapwise Threshold
- 35.3. Group-Level Analysis Strategy 2a: Time-Frequency Windows for Hypothesis-Driven Analyses
- 35.4. Group-Level Analysis Strategy 2b: Subject-Specific Time-Frequency Windows for Hypothesis-Driven Analyses
- 35.5. Determining How Many Subjects You Need for Group-Level Analyses
- 36. Recommendations for Reporting Results in Figures, Tables, and Text
- 36.1. Recommendation 1: One Figure, One Idea
- 36.2. Recommendation 2: Show Data
- 36.3. Recommendation 3: Highlight Significant Effects Instead of Removing Nonsignificant Effects
- 36.4. Recommendation 4: Show Specificity (or Lack Thereof) in Frequency, Time, and Space
- 36.5. Recommendation 5: Use Color
- 36.6. Recommendation 6: Use Informative Figure Labels and Captions
- 36.7. Recommendation 7: Avoid Showing "Representative" Data
- 36.8. A Checklist for Making Figures
- 36.9. Tables
- 36.10. Reporting Results in the Results Section
- pt. VII Conclusions and Future Directions
- 37. Recurring Themes in This Book and Some Personal Advice
- 37.1. Theme: Myriad Possible Analyses
- 37.2. Advice: Avoid the Paralysis of Analysis
- 37.3. Theme: You Don't Have to Program Your Own Analyses, but You Should Know How Analyses Work
- 37.4. Advice: If It Feels Wrong, It Probably Is
- 37.5. Advice: When in Doubt, Plot It Out
- 37.6. Advice: Know These Three Formulas like the Back of Your Hand
- 37.7. Theme: Connectivity over Trials or over Time
- 37.8. Theme: Most Analysis Parameters Introduce Bias
- 37.9. Theme: Write a Clear Methods Section so Others Can Replicate Your Analyses
- 37.10. Theme: Use Descriptive and Appropriate Analysis Terms
- 37.11. Advice: Interpret Null Results Cautiously
- 37.12. Advice: Try Simulations but Also Trust Real Data
- 37.13. Advice: Trust Replications
- 37.14. Theme: Analyses Are Not Right or Wrong; They Are Appropriate or Inappropriate
- 37.15. Advice: Hypothesis Testing Is Good/Bad, and So Is Data-Driven Exploration
- 37.16. Advice: Find Something That Drives You and Study It
- 37.17. Cognitive Electrophysiology: The Art of Finding Anthills on Mountains
- 38. The Future of Cognitive Electrophysiology
- 38.1. Developments in Analysis Methods
- 38.2. Developments in Understanding the Neurophysiology of EEG
- 38.3. Developments in Experiment Design
- 38.4. Developments in Measurement Technology
- 38.5. The Role of the Body in Brain Function
- 38.6. Determining Causality
- 38.7. Inferring Cognitive States from EEG Signatures: Inverse Inference
- 38.8. Tables of Activation
- 38.9. Disease Diagnosis and Predicting Treatment Course and Success
- 38.10. Clinical Relevance Is Not Necessary for the Advancement of Science
- 38.11. Replications
- 38.12. Double-Blind Review for Scientific Publications
- 38.13. ?