LEADER |
00000cam a2200000Mu 4500 |
001 |
EBOOKCENTRAL_ocn961063598 |
003 |
OCoLC |
005 |
20240329122006.0 |
006 |
m o d |
007 |
cr |n|---||||| |
008 |
161112s2016 nju o 000 0 eng d |
040 |
|
|
|a EBLCP
|b eng
|e pn
|c EBLCP
|d OCLCQ
|d CHVBK
|d OCLCO
|d OCLCF
|d IDB
|d OCLCQ
|d OCLCO
|d MERUC
|d ZCU
|d ICG
|d OCLCQ
|d TKN
|d ESU
|d DKC
|d AU@
|d OCLCQ
|d UKAHL
|d OCL
|d UKMGB
|d HF9
|d OCLCQ
|d OCLCO
|d OCLCL
|
015 |
|
|
|a GBB6F3917
|2 bnb
|
016 |
7 |
|
|a 018096012
|2 Uk
|
020 |
|
|
|a 9781119237075
|
020 |
|
|
|a 1119237076
|
020 |
|
|
|a 9781119237099
|q (ePub ebook)
|
020 |
|
|
|a 1119237092
|
020 |
|
|
|z 9781119237068
|q (hbk.)
|
029 |
1 |
|
|a CHNEW
|b 000901705
|
029 |
1 |
|
|a UKMGB
|b 018096012
|
035 |
|
|
|a (OCoLC)961063598
|
037 |
|
|
|a 9781119237099
|b Wiley
|
050 |
|
4 |
|a QA273.Z436 2017
|
082 |
0 |
4 |
|a 519.5
|
049 |
|
|
|a UAMI
|
100 |
1 |
|
|a Zhang, Zhiyi.
|
245 |
1 |
0 |
|a Statistical Implications of Turing's Formula.
|
260 |
|
|
|a Newark :
|b Wiley,
|c 2016.
|
300 |
|
|
|a 1 online resource (299 pages)
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
588 |
0 |
|
|a Print version record.
|
520 |
8 |
|
|a This volume features a broad introduction to recent research on Turing's formula and presents modern applications in statistics, probability, information theory, and other areas of modern data science. It offers a clear introduction to Turing's formula and its connections to statistics.
|b Turing's formula is, perhaps, the only known method for estimating the underlying distributional characteristics beyond the range of observed data without making any parametric or semiparametric assumptions. This book presents a clear introduction to Turing's formula and its connections to statistics. Topics relevant to a variety of fields of study are included, such as information theory; statistics; probability; computer science, including artificial intelligence and machine learning; big data; biology; ecology; and genetics. The author examines many core statistical issues within modern data science from Turing's perspective. A systematic approach to long-standing problems such as entropy and mutual information estimation, diversity index estimation, domains of attraction on general alphabets, and tail probability estimation is presented in light of the most up-to-date understanding of Turing's formula. Featuring numerous exercises and examples throughout, the author summarizes the known properties of Turing's formula and explains how and when it works well; discusses the approach derived from Turing's formula for estimating a variety of quantities, most of which come from information theory but are also important for machine learning and ecological applications; and uses Turing's formula to estimate certain heavy-tailed distributions. In summary, this book: features a unified and broad presentation of Turing's formula, including its connections to statistics, probability, information theory, and other areas of modern data science; provides a presentation of the statistical estimation of information-theoretic quantities; demonstrates the estimation problems of several statistical functions from Turing's perspective, such as Simpson's indices, Shannon's entropy, general diversity indices, mutual information, and Kullback-Leibler divergence; and includes numerous exercises and examples throughout with a fundamental perspective on the key results of Turing's formula. Statistical Implications of Turing's Formula is an ideal reference for researchers and practitioners who need a review of the many critical statistical issues of modern data science. This book is also an appropriate learning resource for biologists, ecologists, and geneticists who are involved with the concept of diversity and its estimation, and it can be used as a textbook for graduate courses in mathematics, probability, statistics, computer science, artificial intelligence, machine learning, big data, and information theory. Zhiyi Zhang, PhD, is Professor of Mathematics and Statistics at The University of North Carolina at Charlotte. He is an active consultant in both industry and government on a wide range of statistical issues, and his current research interests include Turing's formula and its statistical implications; probability and statistics on countable alphabets; nonparametric estimation of entropy and mutual information; tail probability and biodiversity indices; and applications involving extracting statistical information from low-frequency data space. He earned his PhD in Statistics from Rutgers University.
|
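The summary above refers to Turing's formula (often called the Good-Turing formula), which estimates the total probability mass of species not yet observed as T = N1/n, where N1 is the number of species seen exactly once in a sample of size n. Below is a minimal Python sketch of this classical form, for illustration only; the function name turing_formula and the toy sample are illustrative, not part of the record.

    from collections import Counter

    def turing_formula(sample):
        # Estimate the total probability of unseen species via the
        # classical Good-Turing form T = N1 / n, where N1 is the number
        # of species observed exactly once and n is the sample size.
        counts = Counter(sample)                        # frequency of each observed species
        n = sum(counts.values())                        # total sample size
        n1 = sum(1 for c in counts.values() if c == 1)  # number of singletons
        return n1 / n if n else 0.0

    # Toy usage: three singletons out of eight draws -> estimate 3/8 = 0.375
    print(turing_formula(["a", "a", "b", "b", "b", "c", "d", "e"]))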
505 |
0 |
|
|a Contents -- Dedication -- Preface -- Chapter 1: Turing's Formula -- 1.1 Turing's Formula -- 1.2 Univariate Normal Laws -- 1.3 Multivariate Normal Laws -- 1.4 Turing's Formula Augmented -- 1.5 Goodness-of-Fit by Counting Zeros -- 1.6 Remarks -- 1.7 Exercises -- Chapter 2: Estimation of Simpson's Indexes -- 2.1 Generalized Simpson's Indexes -- 2.2 Estimation of Simpson's Indexes -- 2.3 Normal Laws -- 2.4 Illustrative Examples -- 2.5 Remarks -- 2.6 Exercises -- Chapter 3: Estimation of Shannon's Entropy -- 3.1 A Brief Overview -- 3.2 The Plug-in Entropy Estimator -- 3.2.1 When K is Finite -- 3.2.2 When K is Countably Infinite -- 3.3 Entropy Estimator in Turing's Perspective -- 3.3.1 When K is Finite -- 3.3.2 When K is Countably Infinite -- 3.4 Appendix -- 3.4.1 Proof of Lemma 3.2 -- 3.4.2 Proof of Lemma 3.5 -- 3.4.3 Proof of Corollary 3.5 -- 3.4.4 Proof of Lemma 3.14 -- 3.4.5 Proof of Lemma 3.18 -- 3.5 Remarks -- 3.6 Exercises -- Chapter 4: Estimation of Diversity Indexes -- 4.1 A Unified Perspective on Diversity Indexes -- 4.2 Estimation of Linear Diversity Indexes -- 4.3 Estimation of Rényi's Entropy -- 4.4 Remarks -- 4.5 Exercises -- Chapter 5: Estimation of Information -- 5.1 Introduction -- 5.2 Estimation of Mutual Information -- 5.2.1 The Plug-in Estimator -- 5.2.2 Estimation in Turing's Perspective -- 5.2.3 Estimation of Standardized Mutual Information -- 5.2.4 An Illustrative Example -- 5.3 Estimation of Kullback-Leibler Divergence -- 5.3.1 The Plug-in Estimator -- 5.3.2 Estimation in Turing's Perspective -- 5.3.3 Symmetrized Kullback-Leibler Divergence -- 5.4 Tests of Hypotheses -- 5.5 Appendix -- 5.5.1 Proof of Theorem 5.12 -- 5.6 Exercises -- Chapter 6: Domains of Attraction on Countable Alphabets -- 6.1 Introduction -- 6.2 Domains of Attraction -- 6.3 Examples and Remarks -- 6.4 Appendix -- 6.4.1 Proof of Lemma 6.3 -- 6.4.2 Proof of Theorem 6.2 -- 6.4.3 Proof of Lemma 6.6 -- 6.5 Exercises -- Chapter 7: Estimation of Tail Probability -- 7.1 Introduction -- 7.2 Estimation of Pareto Tail -- 7.3 Statistical Properties of AMLE -- 7.4 Remarks -- 7.5 Appendix -- 7.5.1 Proof of Lemma 7.7 -- 7.5.2 Proof of Lemma 7.9 -- 7.6 Exercises -- Appendix -- Bibliography -- Index
|
590 |
|
|
|a ProQuest Ebook Central
|b Ebook Central Academic Complete
|
650 |
|
0 |
|a Mathematical statistics
|v Textbooks.
|
650 |
|
0 |
|a Probabilities
|v Textbooks.
|
650 |
|
7 |
|a Mathematical statistics
|2 fast
|
650 |
|
7 |
|a Probabilities
|2 fast
|
655 |
|
7 |
|a Textbooks
|2 fast
|
758 |
|
|
|i has work:
|a Statistical implications of Turing's formula (Text)
|1 https://id.oclc.org/worldcat/entity/E39PCFMCDJVHQtTDqCHkg89mBd
|4 https://id.oclc.org/worldcat/ontology/hasWork
|
776 |
0 |
8 |
|i Print version:
|a Zhang, Zhiyi.
|t Statistical Implications of Turing's Formula.
|d Newark : Wiley, ©2016
|z 9781119237068
|
856 |
4 |
0 |
|u https://ebookcentral.uam.elogim.com/lib/uam-ebooks/detail.action?docID=4718308
|z Full text
|
938 |
|
|
|a Askews and Holts Library Services
|b ASKH
|n AH30712640
|
938 |
|
|
|a EBL - Ebook Library
|b EBLB
|n EBL4718308
|
994 |
|
|
|a 92
|b IZTAP
|