LEADER |
00000cam a2200000 a 4500 |
001 |
EBOOKCENTRAL_ocn759530314 |
003 |
OCoLC |
005 |
20240329122006.0 |
006 |
m o d |
007 |
cr cn||||||||| |
008 |
111103s2011 njua ob 001 0 eng d |
040 |
|
|
|a UIU
|b eng
|e pn
|c UIU
|d YDXCP
|d WAU
|d CDX
|d COO
|d OCLCF
|d EBLCP
|d N$T
|d DEBSZ
|d AU@
|d OHS
|d E7B
|d IDEBK
|d REDDC
|d CHVBK
|d OCLCQ
|d DG1
|d Z5A
|d LIP
|d OCLCQ
|d MERUC
|d OCLCQ
|d WYU
|d YOU
|d TKN
|d U3W
|d OCLCQ
|d UKAHL
|d OL$
|d OCLCQ
|d OCLCO
|d OCLCQ
|d OCLCO
|d OCLCL
|d OCLCQ
|d OCLCL
|
019 |
|
|
|a 742798716
|a 794326191
|a 808669906
|a 816882892
|a 880752033
|a 1066494002
|
020 |
|
|
|a 9781119995678
|q (electronic bk.)
|
020 |
|
|
|a 1119995671
|q (electronic bk.)
|
020 |
|
|
|a 9781119995685
|q (electronic bk.)
|
020 |
|
|
|a 111999568X
|q (electronic bk.)
|
020 |
|
|
|z 9781119993896
|q (cloth)
|
020 |
|
|
|z 111999389X
|q (cloth)
|
024 |
8 |
|
|a 9786613405593
|
029 |
1 |
|
|a AU@
|b 000050012976
|
029 |
1 |
|
|a AU@
|b 000052898899
|
029 |
1 |
|
|a AU@
|b 000061128687
|
029 |
1 |
|
|a CHNEW
|b 000613817
|
029 |
1 |
|
|a CHNEW
|b 000938447
|
029 |
1 |
|
|a CHVBK
|b 480187541
|
029 |
1 |
|
|a DEBBG
|b BV043393664
|
029 |
1 |
|
|a DEBSZ
|b 372696066
|
029 |
1 |
|
|a DEBSZ
|b 396994733
|
029 |
1 |
|
|a DEBSZ
|b 425883728
|
029 |
1 |
|
|a DEBSZ
|b 430990596
|
029 |
1 |
|
|a DEBSZ
|b 44924119X
|
029 |
1 |
|
|a DEBSZ
|b 48500822X
|
029 |
1 |
|
|a NZ1
|b 13751537
|
029 |
1 |
|
|a NZ1
|b 15341321
|
035 |
|
|
|a (OCoLC)759530314
|z (OCoLC)742798716
|z (OCoLC)794326191
|z (OCoLC)808669906
|z (OCoLC)816882892
|z (OCoLC)880752033
|z (OCoLC)1066494002
|
050 |
|
4 |
|a QA273.6
|b .M59 2011
|
072 |
|
7 |
|a MAT
|x 029000
|2 bisacsh
|
082 |
0 |
4 |
|a 519.2/4
|2 22
|
084 |
|
|
|a SK 830
|2 rvk
|
049 |
|
|
|a UAMI
|
245 |
0 |
0 |
|a Mixtures :
|b estimation and applications /
|c edited by Kerrie L. Mengersen, Christian P. Robert, D. Michael Titterington.
|
260 |
|
|
|a Hoboken, N.J. :
|b Wiley,
|c 2011.
|
300 |
|
|
|a 1 online resource (xviii, 311 pages) :
|b illustrations
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
504 |
|
|
|a Includes bibliographical references and index.
|
505 |
0 |
0 |
|g Machine generated contents note:
|g 1.
|t The EM algorithm, variational approximations and expectation propagation for mixtures /
|r D. Michael Titterington --
|g 1.1.
|t Preamble --
|g 1.2.
|t The EM algorithm --
|g 1.2.1.
|t Introduction to the algorithm --
|g 1.2.2.
|t The E-step and the M-step for the mixing weights --
|g 1.2.3.
|t The M-step for mixtures of univariate Gaussian distributions --
|g 1.2.4.
|t M-step for mixtures of regular exponential family distributions formulated in terms of the natural parameters --
|g 1.2.5.
|t Application to other mixtures --
|g 1.2.6.
|t EM as a double expectation --
|g 1.3.
|t Variational approximations --
|g 1.3.1.
|t Preamble --
|g 1.3.2.
|t Introduction to variational approximations --
|g 1.3.3.
|t Application of variational Bayes to mixture problems --
|g 1.3.4.
|t Application to other mixture problems --
|g 1.3.5.
|t Recursive variational approximations --
|g 1.3.6.
|t Asymptotic results --
|g 1.4.
|t Expectation-propagation --
|g 1.4.1.
|t Introduction --
|g 1.4.2.
|t Overview of the recursive approach to be adopted.
|
505 |
0 |
0 |
|g 1.4.3.
|t Finite Gaussian mixtures with an unknown mean parameter --
|g 1.4.4.
|t Mixture of two known distributions --
|g 1.4.5.
|t Discussion --
|t Acknowledgements --
|t References --
|g 2.
|t Online expectation maximisation /
|r Olivier Cappé --
|g 2.1.
|t Introduction --
|g 2.2.
|t Model and assumptions --
|g 2.3.
|t The EM algorithm and the limiting EM recursion --
|g 2.3.1.
|t The batch EM algorithm --
|g 2.3.2.
|t The limiting EM recursion --
|g 2.3.3.
|t Limitations of batch EM for long data records --
|g 2.4.
|t Online expectation maximisation --
|g 2.4.1.
|t The algorithm --
|g 2.4.2.
|t Convergence properties --
|g 2.4.3.
|t Application to finite mixtures --
|g 2.4.4.
|t Use for batch maximum-likelihood estimation --
|g 2.5.
|t Discussion --
|t References --
|g 3.
|t The limiting distribution of the EM test of the order of a finite mixture /
|r Pengfei Li --
|g 3.1.
|t Introduction --
|g 3.2.
|t The method and theory of the EM test --
|g 3.2.1.
|t The definition of the EM test statistic --
|g 3.2.2.
|t The limiting distribution of the EM test statistic --
|g 3.3.
|t Proofs.
|
505 |
0 |
0 |
|g 3.4.
|t Discussion --
|t References --
|g 4.
|t Comparing Wald and likelihood regions applied to locally identifiable mixture models /
|r Bruce G. Lindsay --
|g 4.1.
|t Introduction --
|g 4.2.
|t Background on likelihood confidence regions --
|g 4.2.1.
|t Likelihood regions --
|g 4.2.2.
|t Profile likelihood regions --
|g 4.2.3.
|t Alternative methods --
|g 4.3.
|t Background on simulation and visualisation of the likelihood regions --
|g 4.3.1.
|t Modal simulation method --
|g 4.3.2.
|t Illustrative example --
|g 4.4.
|t Comparison between the likelihood regions and the Wald regions --
|g 4.4.1.
|t Volume/volume error of the confidence regions --
|g 4.4.2.
|t Differences in univariate intervals via worst case analysis --
|g 4.4.3.
|t Illustrative example (revisited) --
|g 4.5.
|t Application to a finite mixture model --
|g 4.5.1.
|t Nonidentifiabilities and likelihood regions for the mixture parameters --
|g 4.5.2.
|t Mixture likelihood region simulation and visualisation --
|g 4.5.3.
|t Adequacy of using the Wald confidence region.
|
505 |
0 |
0 |
|g 4.6.
|t Data analysis --
|g 4.7.
|t Discussion --
|t References --
|g 5.
|t Mixture of experts modelling with social science applications /
|r Thomas Brendan Murphy --
|g 5.1.
|t Introduction --
|g 5.2.
|t Motivating examples --
|g 5.2.1.
|t Voting blocs --
|g 5.2.2.
|t Social and organisational structure --
|g 5.3.
|t Mixture models --
|g 5.4.
|t Mixture of experts models --
|g 5.5.
|t A mixture of experts model for ranked preference data --
|g 5.5.1.
|t Examining the clustering structure --
|g 5.6.
|t A mixture of experts latent position cluster model --
|g 5.7.
|t Discussion --
|t Acknowledgements --
|t References --
|g 6.
|t Modelling conditional densities using finite smooth mixtures /
|r Robert Kohn --
|g 6.1.
|t Introduction --
|g 6.2.
|t The model and prior --
|g 6.2.1.
|t Smooth mixtures --
|g 6.2.2.
|t The component models --
|g 6.2.3.
|t The prior --
|g 6.3.
|t Inference methodology --
|g 6.3.1.
|t The general MCMC scheme --
|g 6.3.2.
|t Updating β and I using variable-dimension finite-step Newton proposals --
|g 6.3.3.
|t Model comparison --
|g 6.4.
|t Applications --
|g 6.4.1.
|t A small simulation study.
|
505 |
0 |
0 |
|g 6.4.2.
|t LIDAR data --
|g 6.4.3.
|t Electricity expenditure data --
|g 6.5.
|t Conclusions --
|t Acknowledgements --
|t Appendix: Implementation details for the gamma and log-normal models --
|t References --
|g 7.
|t Nonparametric mixed membership modelling using the IBP compound Dirichlet process /
|r David M. Blei --
|g 7.1.
|t Introduction --
|g 7.2.
|t Mixed membership models --
|g 7.2.1.
|t Latent Dirichlet allocation --
|g 7.2.2.
|t Nonparametric mixed membership models --
|g 7.3.
|t Motivation --
|g 7.4.
|t Decorrelating prevalence and proportion --
|g 7.4.1.
|t Indian buffet process --
|g 7.4.2.
|t The IBP compound Dirichlet process --
|g 7.4.3.
|t An application of the ICD: focused topic models --
|g 7.4.4.
|t Inference --
|g 7.5.
|t Related models --
|g 7.6.
|t Empirical studies --
|g 7.7.
|t Discussion --
|t References --
|g 8.
|t Discovering nonbinary hierarchical structures with Bayesian rose trees /
|r Katherine A. Heller --
|g 8.1.
|t Introduction --
|g 8.2.
|t Prior work --
|g 8.3.
|t Rose trees, partitions and mixtures --
|g 8.4.
|t Avoiding needless cascades --
|g 8.4.1.
|t Cluster models.
|
505 |
0 |
0 |
|g 8.5.
|t Greedy construction of Bayesian rose tree mixtures --
|g 8.5.1.
|t Prediction --
|g 8.5.2.
|t Hyperparameter optimisation --
|g 8.6.
|t Bayesian hierarchical clustering, Dirichlet process models and product partition models --
|g 8.6.1.
|t Mixture models and product partition models --
|g 8.6.2.
|t PCluster and Bayesian hierarchical clustering --
|g 8.7.
|t Results --
|g 8.7.1.
|t Optimality of tree structure --
|g 8.7.2.
|t Hierarchy likelihoods --
|g 8.7.3.
|t Partially observed data --
|g 8.7.4.
|t Psychological hierarchies --
|g 8.7.5.
|t Hierarchies of Gaussian process experts --
|g 8.8.
|t Discussion --
|t References --
|g 9.
|t Mixtures of factor analysers for the analysis of high-dimensional data /
|r Suren I. Rathnayake --
|g 9.1.
|t Introduction --
|g 9.2.
|t Single-factor analysis model --
|g 9.3.
|t Mixtures of factor analysers --
|g 9.4.
|t Mixtures of common factor analysers (MCFA) --
|g 9.5.
|t Some related approaches --
|g 9.6.
|t Fitting of factor-analytic models --
|g 9.7.
|t Choice of the number of factors q --
|g 9.8.
|t Example --
|g 9.9.
|t Low-dimensional plots via MCFA approach.
|
505 |
0 |
0 |
|g 9.10.
|t Multivariate t-factor analysers --
|g 9.11.
|t Discussion --
|t Appendix --
|t References --
|g 10.
|t Dealing with label switching under model uncertainty /
|r Sylvia Frühwirth-Schnatter --
|g 10.1.
|t Introduction --
|g 10.2.
|t Labelling through clustering in the point-process representation --
|g 10.2.1.
|t The point-process representation of a finite mixture model --
|g 10.2.2.
|t Identification through clustering in the point-process representation --
|g 10.3.
|t Identifying mixtures when the number of components is unknown --
|g 10.3.1.
|t The role of Dirichlet priors in overfitting mixtures --
|g 10.3.2.
|t The meaning of K for overfitting mixtures --
|g 10.3.3.
|t The point-process representation of overfitting mixtures --
|g 10.3.4.
|t Examples --
|g 10.4.
|t Overfitting heterogeneity of component-specific parameters --
|g 10.4.1.
|t Overfitting heterogeneity --
|g 10.4.2.
|t Using shrinkage priors on the component-specific location parameters --
|g 10.5.
|t Concluding remarks --
|t References --
|g 11.
|t Exact Bayesian analysis of mixtures /
|r Kerrie L. Mengersen.
|
505 |
0 |
0 |
|g 11.1.
|t Introduction --
|g 11.2.
|t Formal derivation of the posterior distribution --
|g 11.2.1.
|t Locally conjugate priors --
|g 11.2.2.
|t True posterior distributions --
|g 11.2.3.
|t Poisson mixture --
|g 11.2.4.
|t Multinomial mixtures --
|g 11.2.5.
|t Normal mixtures --
|t References --
|g 12.
|t Manifold MCMC for mixtures /
|r Mark Girolami --
|g 12.1.
|t Introduction --
|g 12.2.
|t Markov chain Monte Carlo Methods --
|g 12.2.1.
|t Metropolis-Hastings --
|g 12.2.2.
|t Gibbs sampling --
|g 12.2.3.
|t Manifold Metropolis adjusted Langevin algorithm --
|g 12.2.4.
|t Manifold Hamiltonian Monte Carlo --
|g 12.3.
|t Finite Gaussian mixture models --
|g 12.3.1.
|t Gibbs sampler for mixtures of univariate Gaussians --
|g 12.3.2.
|t Manifold MCMC for mixtures of univariate Gaussians --
|g 12.3.3.
|t Metric tensor --
|g 12.3.4.
|t An illustrative example --
|g 12.4.
|t Experiments --
|g 12.5.
|t Discussion --
|t Acknowledgements --
|t Appendix --
|t References --
|g 13.
|t How many components in a finite mixture? /
|r Murray Aitkin --
|g 13.1.
|t Introduction --
|g 13.2.
|t The galaxy data --
|g 13.3.
|t The normal mixture model.
|
505 |
0 |
0 |
|g 13.4.
|t Bayesian analyses --
|g 13.4.1.
|t Escobar and West --
|g 13.4.2.
|t Phillips and Smith --
|g 13.4.3.
|t Roeder and Wasserman --
|g 13.4.4.
|t Richardson and Green --
|g 13.4.5.
|t Stephens --
|g 13.5.
|t Posterior distributions for K (for flat prior) --
|g 13.6.
|t Conclusions from the Bayesian analyses --
|g 13.7.
|t Posterior distributions of the model deviances --
|g 13.8.
|t Asymptotic distributions --
|g 13.9.
|t Posterior deviances for the galaxy data --
|g 13.10.
|t Conclusions --
|t References --
|g 14.
|t Bayesian mixture models: a blood-free dissection of a sheep /
|r Graham E. Gardner --
|g 14.1.
|t Introduction --
|g 14.2.
|t Mixture models --
|g 14.2.1.
|t Hierarchical normal mixture --
|g 14.3.
|t Altering dimensions of the mixture model --
|g 14.4.
|t Bayesian mixture model incorporating spatial information --
|g 14.4.1.
|t Results --
|g 14.5.
|t Volume calculation --
|g 14.6.
|t Discussion --
|t References.
|
588 |
0 |
|
|a Print version record.
|
520 |
|
|
|a This book uses the EM (expectation maximization) algorithm to simultaneously estimate the missing data and unknown parameter(s) associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions covering applications of the methods, and the book features chapters from the leading experts on the subject.
|
590 |
|
|
|a ProQuest Ebook Central
|b Ebook Central Academic Complete
|
650 |
|
0 |
|a Mixture distributions (Probability theory)
|
650 |
|
6 |
|a Distribution composée (Théorie des probabilités)
|
650 |
|
7 |
|a MATHEMATICS
|x Probability & Statistics
|x General.
|2 bisacsh
|
650 |
|
7 |
|a Mixture distributions (Probability theory)
|2 fast
|
700 |
1 |
|
|a Mengersen, Kerrie L.
|
700 |
1 |
|
|a Robert, Christian P.,
|d 1961-
|1 https://id.oclc.org/worldcat/entity/E39PBJvxtCGQCFTQpWgvtWJ7HC
|
700 |
1 |
|
|a Titterington, D. M.
|
758 |
|
|
|i has work:
|a Mixtures (Text)
|1 https://id.oclc.org/worldcat/entity/E39PCGP84cPpbmfwjmYjg9RYRq
|4 https://id.oclc.org/worldcat/ontology/hasWork
|
776 |
0 |
8 |
|i Print version:
|t Mixtures.
|d Hoboken, N.J. : Wiley, 2011
|z 9781119993896
|w (DLC) 2010053469
|w (OCoLC)698450396
|
856 |
4 |
0 |
|u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=693765
|z Texto completo
|
938 |
|
|
|a Askews and Holts Library Services
|b ASKH
|n AH21634490
|
938 |
|
|
|a Askews and Holts Library Services
|b ASKH
|n AH21634294
|
938 |
|
|
|a Coutts Information Services
|b COUT
|n 20575755
|
938 |
|
|
|a EBL - Ebook Library
|b EBLB
|n EBL693765
|
938 |
|
|
|a ebrary
|b EBRY
|n ebr10510417
|
938 |
|
|
|a EBSCOhost
|b EBSC
|n 509901
|
938 |
|
|
|a ProQuest MyiLibrary Digital eBook Collection
|b IDEB
|n 340559
|
938 |
|
|
|a YBP Library Services
|b YANK
|n 6628095
|
938 |
|
|
|a YBP Library Services
|b YANK
|n 3588249
|
994 |
|
|
|a 92
|b IZTAP
|