Advances in independent component analysis

Bibliographic Information

Mark Girolami (ed.)

(Perspectives in neural computing)

Springer, c2000

Held by 32 university libraries

Notes

Includes bibliographical references and index

Description and Table of Contents

Description

Independent Component Analysis (ICA) is a fast-developing area of intense research interest. Following on from Self-Organising Neural Networks: Independent Component Analysis and Blind Signal Separation, this book reviews the significant developments of the past year. It covers topics such as the use of hidden Markov methods, the independence assumption, and topographic ICA, and includes tutorial chapters on Bayesian and variational approaches. It also presents the latest approaches to ICA problems, including a first investigation into certain "hard problems". Comprising contributions from the most respected and innovative researchers in the field, this volume will be of interest to students and researchers in computer science and electrical engineering; research and development personnel in statistical modelling and data analysis; bioinformatics practitioners; and physicists and chemists requiring novel data analysis methods.
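For orientation, the basic noiseless linear ICA model underlying the book's topics (and the FastICA algorithm discussed in chapter 10) can be sketched as x = As: independent sources s are mixed by an unknown matrix A, and ICA recovers them from the observations x alone. The sketch below is illustrative only, not taken from the book; the sources, mixing matrix, and use of scikit-learn's FastICA are all assumptions for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical toy data: two independent sources (a sine and a square
# wave) mixed by an assumed mixing matrix A into observed signals x.
t = np.linspace(0, 8, 2000)
s = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # true sources (2000, 2)
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])                        # assumed mixing matrix
x = s @ A.T                                       # observed mixtures x = A s

# FastICA estimates the sources up to scale and permutation.
ica = FastICA(n_components=2, random_state=0)
s_hat = ica.fit_transform(x)                      # estimated sources (2000, 2)
```

Note that ICA leaves the order and amplitude of the recovered components indeterminate, so each column of `s_hat` matches some true source only up to sign and scale.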

Table of Contents

I Temporal ICA Models

1 Hidden Markov Independent Component Analysis: 1.1 Introduction. 1.2 Hidden Markov Models. 1.3 Independent Component Analysis. 1.3.1 Generalised Exponential Sources. 1.3.2 Generalised Autoregressive Sources. 1.4 Hidden Markov ICA. 1.4.1 Generalised Exponential Sources. 1.4.2 Generalised Autoregressive Sources. 1.5 Practical Issues. 1.5.1 Initialisation. 1.5.2 Learning. 1.5.3 Model Order Selection. 1.6 Results. 1.6.1 Multiple Sinewave Sources. 1.6.2 Same Sources, Different Mixing. 1.6.3 Same Mixing, Different Sources. 1.6.4 EEG Data. 1.7 Conclusion. 1.8 Acknowledgements. 1.9 Appendix.

2 Particle Filters for Non-Stationary ICA: 2.1 Introduction. 2.2 Stationary ICA. 2.3 Non-Stationary Independent Component Analysis. 2.3.1 Source Model. 2.4 Particle Filters. 2.4.1 Source Recovery. 2.5 Illustration of Non-Stationary ICA. 2.6 Smoothing. 2.7 Temporal Correlations. 2.8 Conclusion. 2.8.1 Acknowledgement. 2.9 Appendix: Laplace's Approximation for the Likelihood.

II The Validity of the Independence Assumption

3 The Independence Assumption: Analyzing the Independence of the Components by Topography: 3.1 Introduction. 3.2 Background: Independent Subspace Analysis. 3.3 Topographic ICA Model. 3.3.1 Dependence and Topography. 3.3.2 Defining Topographic ICA. 3.3.3 The Generative Model. 3.3.4 Basic Properties of the Topographic ICA Model. 3.4 Learning Rule. 3.5 Comparison with Other Topographic Mappings. 3.6 Experiments. 3.6.1 Experiments in Feature Extraction of Image Data. 3.6.2 Experiments in Feature Extraction of Audio Data. 3.6.3 Experiments with Magnetoencephalographic Recordings. 3.7 Conclusion.

4 The Independence Assumption: Dependent Component Analysis: 4.1 Introduction. 4.2 Blind Source Separation by DCA. 4.3 The "Cyclone" Algorithm. 4.4 Experimental Results. 4.5 Higher-Order Cyclostationary Signal Separation. 4.6 Conclusion. 4.7 Appendix: Proof of ACF Property 3.

III Ensemble Learning and Applications

5 Ensemble Learning: 5.1 Introduction. 5.2 Posterior Averages in Action. 5.3 Approximations of Posterior PDF. 5.4 Ensemble Learning. 5.4.1 Model Selection in Ensemble Learning. 5.4.2 Connection to Coding. 5.4.3 EM and MAP. 5.5 Construction of Probabilistic Models. 5.5.1 Priors and Hyperpriors. 5.6 Examples. 5.6.1 Fixed Form Q. 5.6.2 Free Form Q. 5.7 Conclusion. References.

6 Bayesian Non-Linear Independent Component Analysis by Multi-Layer Perceptrons: 6.1 Introduction. 6.2 Choosing Among Competing Explanations. 6.3 Non-Linear Factor Analysis. 6.3.1 Definition of the Model. 6.3.2 Cost Function. 6.3.3 Update Rules. 6.4 Non-Linear Independent Factor Analysis. 6.5 Experiments. 6.5.1 Learning Scheme. 6.5.2 Helix. 6.5.3 Non-Linear Artificial Data. 6.5.4 Process Data. 6.6 Comparison with Existing Methods. 6.6.1 SOM and GTM. 6.6.2 Auto-Associative MLPs. 6.6.3 Generative Learning with MLPs. 6.7 Conclusion. 6.7.1 Validity of the Approximations. 6.7.2 Initial Inversion by Auxiliary MLP. 6.7.3 Future Directions. 6.8 Acknowledgements.

7 Ensemble Learning for Blind Image Separation and Deconvolution: 7.1 Introduction. 7.2 Separation of Images. 7.2.1 Learning the Ensemble. 7.2.2 Learning the Model. 7.2.3 Example. 7.2.4 Parts-Based Image Decomposition. 7.3 Deconvolution of Images. 7.4 Conclusion. 7.5 Acknowledgements. References.

IV Data Analysis and Applications

8 Multi-Class Independent Component Analysis (MUCICA) for Rank-Deficient Distributions: 8.1 Introduction. 8.2 The Rank-Deficient One Class Problem. 8.2.1 Method I: Three Blocks. 8.2.2 Method II: Two Blocks. 8.2.3 Method III: One Block. 8.3 The Rank-Deficient Multi-Class Problem. 8.4 Simulations. 8.5 Conclusion. References.

9 Blind Separation of Noisy Image Mixtures: 9.1 Introduction. 9.2 The Likelihood. 9.3 Estimation of Sources for the Case of Known Parameters. 9.4 Joint Estimation of Sources, Mixing Matrix, and Noise Level. 9.5 Simulation Example. 9.6 Generalization and the Bias-Variance Dilemma. 9.7 Application to Neuroimaging. 9.8 Conclusion. 9.9 Acknowledgments. 9.10 Appendix: The Generalized Boltzmann Learning Rule.

10 Searching for Independence in Electromagnetic Brain Waves: 10.1 Introduction. 10.2 Independent Component Analysis. 10.2.1 The Model. 10.2.2 The FastICA Algorithm. 10.3 Electro- and Magnetoencephalography. 10.4 On the Validity of the Linear ICA Model. 10.5 The Analysis of EEG and MEG Data. 10.5.1 Artifact Identification and Removal from EEG/MEG. 10.5.2 Analysis of Multimodal Evoked Fields. 10.5.3 Segmenting Auditory Evoked Fields. 10.6 Conclusion.

11 ICA on Noisy Data: A Factor Analysis Approach: 11.1 Introduction. 11.2 Factor Analysis and ICA. 11.2.1 Factor Analysis. 11.2.2 Factor Analysis in Preprocessing. 11.2.3 ICA as Determining the Rotation Matrix. 11.3 Experiment with Synthesized Data. 11.4 MEG Data Analysis. 11.4.1 Experiment with Phantom Data. 11.4.2 Experiment with Real Brain Data. 11.5 Conclusion. 11.6 Acknowledgements.

12 Analysis of Optical Imaging Data Using Weak Models and ICA: 12.1 Introduction. 12.2 Linear Component Analysis. 12.3 Singular Value Decomposition. 12.3.1 SVD Applied to OI Data Set. 12.4 Independent Component Analysis. 12.4.1 Minimisation Routines. 12.4.2 Application of sICA to OI Data. 12.5 The Weak Causal Model. 12.5.1 Weak Causal Model Applied to the OI Data Set. 12.5.2 Some Remarks on Significance Testing. 12.6 The Weak Periodic Model. 12.7 Regularised Weak Models. 12.8 Regularised Weak Causal Model Applied to OI Data. 12.9 Image Goodness and Multiple Models. 12.10 A Last Look at the OI Data Set. 12.11 Conclusion. References.

13 Independent Components in Text: 13.1 Introduction. 13.1.1 Vector Space Representations. 13.1.2 Latent Semantic Indexing. 13.2 Independent Component Analysis. 13.2.1 Noisy Separation of Linear Mixtures. 13.2.2 Learning ICA Text Representations on the LSI Space. 13.2.3 Document Classification Based on Independent Components. 13.2.4 Keywords from Context Vectors. 13.2.5 Generalisation and the Bias-Variance Dilemma. 13.3 Examples. 13.3.1 MED Data Set. 13.3.2 CRAN Data Set. 13.4 Conclusion.

14 Seeking Independence Using Biologically-Inspired ANNs: 14.1 Introduction. 14.2 The Negative Feedback Network. 14.3 Independence in Unions of Sources. 14.3.1 Factor Analysis. 14.3.2 Minimal Overcomplete Bases. 14.4 Canonical Correlation Analysis. 14.4.1 Extracting Multiple Correlations. 14.4.2 Using Minimum Correlations to Extract Independent Sources. 14.4.3 Experiments. 14.5 ε-Insensitive Hebbian Learning. 14.5.1 Is this a Hebbian Rule? 14.5.2 Extraction of Sinusoids. 14.5.3 Noise Reduction. 14.6 Conclusion. References.

Source: Nielsen BookData
