Mixtures : estimation and applications
Author(s): edited by Kerrie L. Mengersen, Christian P. Robert and D. Michael Titterington
Bibliographic Information
Mixtures : estimation and applications
(Wiley series in probability and mathematical statistics)
Wiley, 2011
- : cloth
Other Title: Wiley series in probability and statistics
Available at 20 libraries
Note
Series title on cover: Wiley series in probability and statistics
"This edited volume was simulated by a workshop entitled 'Mixture Estimation and Applications' held at the International Centre for Mathematical Science (ICMS) in Edinburgh on 3-5 March 2010"--Pref
Includes bibliographical references and index
Description and Table of Contents
Description
This book uses the EM (expectation maximization) algorithm to estimate simultaneously the missing data and the unknown parameters associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions of the applications of the methods, with chapters contributed by leading experts on the subject. The applications are drawn from a range of scientific disciplines, including biostatistics, computer science, ecology and finance. This area of statistics is important to many disciplines, and its methodology attracts interest from researchers in the fields in which it can be applied.
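As a concrete illustration of the estimation problem described above (not drawn from the book itself), the following is a minimal Python sketch of the EM algorithm for a two-component univariate Gaussian mixture; the function name em_two_gaussians, the crude initialisation and the fixed iteration count are illustrative assumptions rather than the book's own implementation.

import numpy as np

def em_two_gaussians(x, n_iter=100, seed=0):
    # EM for a two-component univariate Gaussian mixture: the component
    # labels are the "missing data"; the weights, means and variances are
    # the unknown parameters estimated jointly.
    rng = np.random.default_rng(seed)
    w = np.array([0.5, 0.5])                   # mixing weights
    mu = rng.choice(x, size=2, replace=False)  # crude initial means
    var = np.array([x.var(), x.var()])         # initial variances
    for _ in range(n_iter):
        # E-step: responsibilities (posterior component probabilities).
        dens = np.stack(
            [w[k] * np.exp(-0.5 * (x - mu[k]) ** 2 / var[k])
             / np.sqrt(2 * np.pi * var[k]) for k in range(2)],
            axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: maximise the expected complete-data log-likelihood.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Usage: recover the parameters of a simulated two-component mixture.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_two_gaussians(x))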
Table of Contents
Preface
Acknowledgements
List of Contributors
1 The EM algorithm, variational approximations and expectation propagation for mixtures
D. Michael Titterington
1.1 Preamble
1.2 The EM algorithm
1.3 Variational approximations
1.4 Expectation-propagation
Acknowledgements
References
2 Online expectation maximisation
Olivier Cappé
2.1 Introduction
2.2 Model and assumptions
2.3 The EM algorithm and the limiting EM recursion
2.4 Online expectation maximisation
2.5 Discussion
References
3 The limiting distribution of the EM test of the order of a finite mixture
J. Chen and Pengfei Li
3.1 Introduction
3.2 The method and theory of the EM test
3.3 Proofs
3.4 Discussion
References
4 Comparing Wald and likelihood regions applied to locally identifiable mixture models
Daeyoung Kim and Bruce G. Lindsay
4.1 Introduction
4.2 Background on likelihood confidence regions
4.3 Background on simulation and visualisation of the likelihood regions
4.4 Comparison between the likelihood regions and the Wald regions
4.5 Application to a finite mixture model
4.6 Data analysis
4.7 Discussion
References
5 Mixture of experts modelling with social science applications
Isobel Claire Gormley and Thomas Brendan Murphy
5.1 Introduction
5.2 Motivating examples
5.3 Mixture models
5.4 Mixture of experts models
5.5 A mixture of experts model for ranked preference data
5.6 A mixture of experts latent position cluster model
5.7 Discussion
Acknowledgements
References
6 Modelling conditional densities using finite smooth mixtures
Feng Li, Mattias Villani and Robert Kohn
6.1 Introduction
6.2 The model and prior
6.3 Inference methodology
6.4 Applications
6.5 Conclusions
Acknowledgements
Appendix: Implementation details for the gamma and log-normal models
References
7 Nonparametric mixed membership modelling using the IBP compound Dirichlet process
Sinead Williamson, Chong Wang, Katherine A. Heller, and David M. Blei
7.1 Introduction
7.2 Mixed membership models
7.3 Motivation
7.4 Decorrelating prevalence and proportion
7.5 Related models
7.6 Empirical studies
7.7 Discussion
References
8 Discovering nonbinary hierarchical structures with Bayesian rose trees
Charles Blundell, Yee Whye Teh, and Katherine A. Heller
8.1 Introduction
8.2 Prior work
8.3 Rose trees, partitions and mixtures
8.4 Greedy construction of Bayesian rose tree mixtures
8.5 Bayesian hierarchical clustering, Dirichlet process models and product partition models
8.6 Results
8.7 Discussion
References
9 Mixtures of factor analyzers for the analysis of high-dimensional data
Geoffrey J. McLachlan, Jangsun Baek, and Suren I. Rathnayake
9.1 Introduction
9.2 Single-factor analysis model
9.3 Mixtures of factor analyzers
9.4 Mixtures of common factor analyzers (MCFA)
9.5 Some related approaches
9.6 Fitting of factor-analytic models
9.7 Choice of the number of factors q
9.8 Example
9.9 Low-dimensional plots via MCFA approach
9.10 Multivariate t-factor analysers
9.11 Discussion
Appendix
References
10 Dealing with label switching under model uncertainty
Sylvia Frühwirth-Schnatter
10.1 Introduction
10.2 Labelling through clustering in the point-process representation
10.3 Identifying mixtures when the number of components is unknown
10.4 Overfitting heterogeneity of component-specific parameters
10.5 Concluding remarks
References
11 Exact Bayesian analysis of mixtures
Christian P. Robert and Kerrie L. Mengersen
11.1 Introduction
11.2 Formal derivation of the posterior distribution
References
12 Manifold MCMC for mixtures
Vassilios Stathopoulos and Mark Girolami
12.1 Introduction
12.2 Markov chain Monte Carlo methods
12.3 Finite Gaussian mixture models
12.4 Experiments
12.5 Discussion
Acknowledgements
Appendix
References
13 How many components in a finite mixture?
Murray Aitkin
13.1 Introduction
13.2 The galaxy data
13.3 The normal mixture model
13.4 Bayesian analyses
13.5 Posterior distributions for K (for flat prior)
13.6 Conclusions from the Bayesian analyses
13.7 Posterior distributions of the model deviances
13.8 Asymptotic distributions
13.9 Posterior deviances for the galaxy data
13.10 Conclusion
References
14 Bayesian mixture models: a blood-free dissection of a sheep
Clair L. Alston, Kerrie L. Mengersen, and Graham E. Gardner
14.1 Introduction
14.2 Mixture models
14.3 Altering dimensions of the mixture model
14.4 Bayesian mixture model incorporating spatial information
14.5 Volume calculation
14.6 Discussion
References
Index.
by "Nielsen BookData"