Mathematical Statistics: Edition 2

Springer Science & Business Media

This graduate textbook covers topics in statistical theory essential for graduate students preparing for work on a Ph.D. degree in statistics. The first chapter provides a quick overview of concepts and results in measure-theoretic probability theory that are useful in statistics. The second chapter introduces some fundamental concepts in statistical decision theory and inference. Chapters 3-7 contain detailed studies of several important topics: unbiased estimation, parametric estimation, nonparametric estimation, hypothesis testing, and confidence sets. A large number of exercises in each chapter provide not only practice problems for students but also many additional results.

In addition to the classical results typically covered in a textbook at this level, this book introduces topics in modern statistical theory developed in recent years, such as Markov chain Monte Carlo, quasi-likelihoods, empirical likelihoods, statistical functionals, generalized estimating equations, the jackknife, and the bootstrap.

Jun Shao is Professor of Statistics at the University of Wisconsin, Madison.

Also available: Jun Shao and Dongsheng Tu, The Jackknife and Bootstrap, Springer-Verlag New York, Inc., 1995, cloth, 536 pp., ISBN 0-387-94515-6.

Additional Information

Publisher
Springer Science & Business Media
Published on
Feb 3, 2008
Pages
592
ISBN
9780387217185
Language
English
Genres
Mathematics / Applied
Mathematics / Probability & Statistics / General
Mathematics / Probability & Statistics / Stochastic Processes
Content Protection
This content is DRM protected.

Reading information

Smartphones and Tablets

Install the Google Play Books app for Android and iPad/iPhone. It syncs automatically with your account and allows you to read online or offline wherever you are.

Laptops and Computers

You can read books purchased on Google Play using your computer's web browser.

eReaders and other devices

To read on e-ink devices like the Sony eReader or Barnes & Noble Nook, you'll need to download a file and transfer it to your device. Please follow the detailed Help center instructions to transfer the files to supported eReaders.
The first Pannonian Symposium on Mathematical Statistics was held at Bad Tatzmannsdorf (Burgenland, Austria) from September 16th to 21st, 1979. Its aim was to further and intensify scientific cooperation in the Pannonian area, which, in a broad sense, can be understood to cover Hungary, the eastern part of Austria, Czechoslovakia, and parts of Poland, Yugoslavia, and Romania. The location of centers of research in mathematical statistics and probability theory in this territory has been a good reason for the geographical limitation of this meeting. About 70 researchers attended this symposium, and 49 lectures were delivered; a considerable part of the presented papers is collected in this volume. Besides the lectures, vigorous informal discussions among the participants took place, so that many problems were raised and possible approaches to their solution were explored. We take the opportunity to thank Dr. U. Dieter (Graz), Dr. F. Konecny (Wien), Dr. W. Krieger (Göttingen), and Dr. E. Neuwirth (Wien) for their valuable help in the refereeing work for this volume.

The Pannonian Symposium could not have taken place without the support of several institutions: the Austrian Ministry for Research and Science, the state government of Burgenland, the community of Bad Tatzmannsdorf, the Kurbad Tatzmannsdorf AG, the Austrian Society for Information Science and Statistics, IBM Austria, Volksbank Oberwart, Erste Österreichische Spar-Casse, and Spielbanken AG Austria. The Austrian Academy of Sciences made possible the participation of several mathematicians in the Symposium. We express our gratitude to all these institutions for their generous help.
The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. Resampling methods replace the theoretical derivations required in applying traditional methods (such as substitution and linearization) in statistical analysis by repeatedly resampling the original data and making inferences from the resamples. Because of the availability of inexpensive and fast computing, these computer-intensive methods have caught on very rapidly in recent years and are particularly appreciated by applied statisticians.

The primary aims of this book are (1) to provide a systematic introduction to the theory of the jackknife, the bootstrap, and other resampling methods developed in the last twenty years; (2) to provide a guide for applied statisticians, since practitioners often use (or misuse) resampling methods in situations where no theoretical confirmation has been made; and (3) to stimulate the use of the jackknife and bootstrap and further developments of the resampling methods.

The theoretical properties of the jackknife and bootstrap methods are studied in this book in an asymptotic framework. Theorems are illustrated by examples. Finite-sample properties of the jackknife and bootstrap are mostly investigated through examples and/or empirical simulation studies. In addition to the theory for the jackknife and bootstrap methods in problems with independent and identically distributed (i.i.d.) data, we try to cover, as much as we can, applications of the jackknife and bootstrap in various complicated non-i.i.d. data problems.
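To make the resampling idea concrete, here is a minimal Python sketch (not taken from the book) of both methods for estimating the standard error of a statistic: the bootstrap repeatedly draws samples with replacement and looks at the spread of the resulting replicates, while the jackknife recomputes the statistic with each observation left out in turn. The function names and the small data set are illustrative assumptions, not the book's notation.

```python
import math
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_resamples=2000, seed=0):
    """Bootstrap standard error: resample with replacement, recompute
    the statistic on each resample, and take the standard deviation
    of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    ]
    return statistics.stdev(replicates)

def jackknife_se(data, stat=statistics.mean):
    """Jackknife standard error: recompute the statistic on each
    leave-one-out sample and scale the spread by (n - 1) / n."""
    n = len(data)
    reps = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    m = statistics.mean(reps)
    return math.sqrt((n - 1) / n * sum((r - m) ** 2 for r in reps))

# Illustrative data; for the sample mean, the jackknife standard
# error coincides with the classical formula s / sqrt(n).
data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7]
boot = bootstrap_se(data)
jack = jackknife_se(data)
```

For statistics other than the mean (medians, ratios, regression coefficients), the same two functions apply unchanged by passing a different `stat`, which is precisely the appeal of resampling over case-by-case theoretical derivation.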
An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform.

Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
