LEADER 00000cam a2200000 a 4500
001    OR_ocn778857701
003    OCoLC
005    20231017213018.0
006    m o d
007    cr un|---uuuuu
008    120229s2012 nju ob 001 0 eng
010    |a 2012008714
040    |a DLC |b eng |e pn |c DLC |d YDX |d N$T |d E7B |d YDXCP |d UKMGB |d DEBSZ |d CDX |d COO |d UPM |d UMI |d OCLCQ |d OCLCF |d OCLCQ |d LOA |d MOR |d PIFAG |d VGM |d ESU |d OCLCQ |d NJR |d BUF |d OCLCQ |d UUM |d WRM |d COCUF |d CEF |d NRAMU |d VT2 |d AZK |d ZEM |d OCLCQ |d WYU |d UAB |d AU@ |d MM9 |d OCLCQ |d OCLCO |d OCLCQ
016 7  |a 016099008 |2 Uk
019    |a 848895177 |a 961414420 |a 961825262 |a 1170722408 |a 1170993179 |a 1173484527
020    |a 9781118359778 |q (ePub)
020    |a 1118359771 |q (ePub)
020    |a 9781118359754 |q (Adobe PDF)
020    |a 1118359755 |q (Adobe PDF)
020    |a 9781118359761 |q (MobiPocket)
020    |a 1118359763 |q (MobiPocket)
020    |a 1118332571
020    |a 9781118332573
020    |a 9781280775765 |q (MyiLibrary)
020    |a 1280775769 |q (MyiLibrary)
020    |z 9781118166406 |q (hardback)
029 1  |a AU@ |b 000048713111
029 1  |a AU@ |b 000052007811
029 1  |a CHNEW |b 000721602
029 1  |a DEBBG |b BV041430781
029 1  |a DEBSZ |b 372597319
029 1  |a DEBSZ |b 398265801
029 1  |a NLGGC |b 34362642X
029 1  |a NZ1 |b 15317509
029 1  |a AU@ |b 000069138013
035    |a (OCoLC)778857701 |z (OCoLC)848895177 |z (OCoLC)961414420 |z (OCoLC)961825262 |z (OCoLC)1170722408 |z (OCoLC)1170993179 |z (OCoLC)1173484527
037    |a CL0500000223 |b Safari Books Online
042    |a pcc
050 00 |a QA279.5
072  7 |a MAT |x 029010 |2 bisacsh
082 00 |a 519.5/42 |2 23
084    |a MAT029010 |2 bisacsh
049    |a UAMI
100 1  |a Lee, Peter M.
245 10 |a Bayesian statistics : |b an introduction / |c Peter M. Lee.
250    |a 4th ed.
260    |a Chichester, West Sussex ; |a Hoboken, N.J. : |b Wiley, |c 2012.
300    |a 1 online resource
336    |a text |b txt |2 rdacontent
337    |a computer |b c |2 rdamedia
338    |a online resource |b cr |2 rdacarrier
347    |a data file |2 rda
380    |a Bibliography
520    |a "Presents extensive examples throughout the book to complement the theory presented. Includes significant new material on recent techniques such as variational methods, importance sampling, approximate computation and reversible jump MCMC"-- |c Provided by publisher.
504    |a Includes bibliographical references and index.
588 0  |a Print version record and CIP data provided by publisher.
505 0  |a Note continued: 7.3. Informative stopping rules -- 7.3.1. An example on capture and recapture of fish -- 7.3.2. Choice of prior and derivation of posterior -- 7.3.3. The maximum likelihood estimator -- 7.3.4. Numerical example -- 7.4. The likelihood principle and reference priors -- 7.4.1. The case of Bernoulli trials and its general implications -- 7.4.2. Conclusion -- 7.5. Bayesian decision theory -- 7.5.1. The elements of game theory -- 7.5.2. Point estimators resulting from quadratic loss -- 7.5.3. Particular cases of quadratic loss -- 7.5.4. Weighted quadratic loss -- 7.5.5. Absolute error loss -- 7.5.6. Zero-one loss -- 7.5.7. General discussion of point estimation -- 7.6. Bayes linear methods -- 7.6.1. Methodology -- 7.6.2. Some simple examples -- 7.6.3. Extensions -- 7.7. Decision theory and hypothesis testing -- 7.7.1. Relationship between decision theory and classical hypothesis testing -- 7.7.2. Composite hypotheses -- 7.8. Empirical Bayes methods -- 7.8.1. Von Mises' example -- 7.8.2. The Poisson case -- 7.9. Exercises on Chapter 7 -- 8. Hierarchical models -- 8.1. The idea of a hierarchical model -- 8.1.1. Definition -- 8.1.2. Examples -- 8.1.3. Objectives of a hierarchical analysis -- 8.1.4. More on empirical Bayes methods -- 8.2. The hierarchical normal model -- 8.2.1. The model -- 8.2.2. The Bayesian analysis for known overall mean -- 8.2.3. The empirical Bayes approach -- 8.3. The baseball example -- 8.4. The Stein estimator -- 8.4.1. Evaluation of the risk of the James-Stein estimator -- 8.5. Bayesian analysis for an unknown overall mean -- 8.5.1. Derivation of the posterior -- 8.6. The general linear model revisited -- 8.6.1. An informative prior for the general linear model -- 8.6.2. Ridge regression -- 8.6.3. A further stage to the general linear model -- 8.6.4. The one way model -- 8.6.5. Posterior variances of the estimators -- 8.7. Exercises on Chapter 8 -- 9. The Gibbs sampler and other numerical methods -- 9.1. Introduction to numerical methods -- 9.1.1. Monte Carlo methods -- 9.1.2. Markov chains -- 9.2. The EM algorithm -- 9.2.1. The idea of the EM algorithm -- 9.2.2. Why the EM algorithm works -- 9.2.3. Semi-conjugate prior with a normal likelihood -- 9.2.4. The EM algorithm for the hierarchical normal model -- 9.2.5. A particular case of the hierarchical normal model -- 9.3. Data augmentation by Monte Carlo -- 9.3.1. The genetic linkage example revisited -- 9.3.2. Use of R -- 9.3.3. The genetic linkage example in R -- 9.3.4. Other possible uses for data augmentation -- 9.4. The Gibbs sampler -- 9.4.1. Chained data augmentation -- 9.4.2. An example with observed data -- 9.4.3. More on the semi-conjugate prior with a normal likelihood -- 9.4.4. The Gibbs sampler as an extension of chained data augmentation -- 9.4.5. An application to change-point analysis -- 9.4.6. Other uses of the Gibbs sampler -- 9.4.7. More about convergence -- 9.5. Rejection sampling -- 9.5.1. Description -- 9.5.2. Example -- 9.5.3. Rejection sampling for log-concave distributions -- 9.5.4. A practical example -- 9.6. The Metropolis-Hastings algorithm -- 9.6.1. Finding an invariant distribution -- 9.6.2. The Metropolis-Hastings algorithm -- 9.6.3. Choice of a candidate density -- 9.6.4. Example -- 9.6.5. More realistic examples -- 9.6.6. Gibbs as a special case of Metropolis-Hastings -- 9.6.7. Metropolis within Gibbs -- 9.7. Introduction to WinBUGS and OpenBUGS -- 9.7.1. Information about WinBUGS and OpenBUGS -- 9.7.2. Distributions in WinBUGS and OpenBUGS -- 9.7.3. A simple example using WinBUGS -- 9.7.4. The pump failure example revisited -- 9.7.5. DoodleBUGS -- 9.7.6. coda -- 9.7.7. R2WinBUGS and R2OpenBUGS -- 9.8. Generalized linear models -- 9.8.1. Logistic regression -- 9.8.2. A general framework -- 9.9. Exercises on Chapter 9 -- 10. Some approximate methods -- 10.1. Bayesian importance sampling -- 10.1.1. Importance sampling to find HDRs -- 10.1.2. Sampling importance re-sampling -- 10.1.3. Multidimensional applications -- 10.2. Variational Bayesian methods: simple case -- 10.2.1. Independent parameters -- 10.2.2. Application to the normal distribution -- 10.2.3. Updating the mean -- 10.2.4. Updating the variance -- 10.2.5. Iteration -- 10.2.6. Numerical example -- 10.3. Variational Bayesian methods: general case -- 10.3.1. A mixture of multivariate normals -- 10.4. ABC: Approximate Bayesian Computation -- 10.4.1. The ABC rejection algorithm -- 10.4.2. The genetic linkage example -- 10.4.3. The ABC Markov Chain Monte Carlo algorithm -- 10.4.4. The ABC Sequential Monte Carlo algorithm -- 10.4.5. The ABC local linear regression algorithm -- 10.4.6. Other variants of ABC -- 10.5. Reversible jump Markov chain Monte Carlo -- 10.5.1. RJMCMC algorithm -- 10.6. Exercises on Chapter 10 -- Appendix A Common statistical distributions -- A.1. Normal distribution -- A.2. Chi-squared distribution -- A.3. Normal approximation to chi-squared -- A.4. Gamma distribution -- A.5. Inverse chi-squared distribution -- A.6. Inverse chi distribution -- A.7. Log chi-squared distribution -- A.8. Student's t distribution -- A.9. Normal/chi-squared distribution -- A.10. Beta distribution -- A.11. Binomial distribution -- A.12. Poisson distribution -- A.13. Negative binomial distribution -- A.14. Hypergeometric distribution -- A.15. Uniform distribution -- A.16. Pareto distribution -- A.17. Circular normal distribution -- A.18. Behrens' distribution -- A.19. Snedecor's F distribution -- A.20. Fisher's z distribution -- A.21. Cauchy distribution -- A.22. The probability that one beta variable is greater than another -- A.23. Bivariate normal distribution -- A.24. Multivariate normal distribution -- A.25. Distribution of the correlation coefficient -- Appendix B Tables -- B.1. Percentage points of the Behrens-Fisher distribution -- B.2. Highest density regions for the chi-squared distribution -- B.3. HDRs for the inverse chi-squared distribution -- B.4. Chi-squared corresponding to HDRs for log chi-squared -- B.5. Values of F corresponding to HDRs for log F -- Appendix C R programs -- Appendix D Further reading -- D.1. Robustness -- D.2. Nonparametric methods -- D.3. Multivariate estimation -- D.4. Time series and forecasting -- D.5. Sequential methods -- D.6. Numerical methods -- D.7. Bayesian networks -- D.8. General reading.
590    |a O'Reilly |b O'Reilly Online Learning: Academic/Public Library Edition
650  0 |a Bayesian statistical decision theory.
650  6 |a Théorie de la décision bayésienne.
650  7 |a MATHEMATICS |x Probability & Statistics |x Bayesian Analysis. |2 bisacsh
650  7 |a Bayesian statistical decision theory. |2 fast |0 (OCoLC)fst00829019
776 08 |i Print version: |a Lee, Peter M. |t Bayesian statistics. |b 4th ed. |d Chichester, West Sussex ; Hoboken, N.J. : Wiley, 2012 |z 9781118166406 |w (DLC) 2012007007
856 40 |u https://learning.oreilly.com/library/view/~/9781118359778/?ar |z Full text (prior registration with an institutional email address required)
938    |a Coutts Information Services |b COUT |n 23991099
938    |a ebrary |b EBRY |n ebr10570748
938    |a EBSCOhost |b EBSC |n 463079
938    |a YBP Library Services |b YANK |n 8855527
994    |a 92 |b IZTAP