An Introduction to Benford's Law

Princeton University Press

This book provides the first comprehensive treatment of Benford’s law, the surprising logarithmic distribution of significant digits discovered in the late nineteenth century. Establishing the mathematical and statistical principles that underpin this intriguing phenomenon, the text combines up-to-date theoretical results with overviews of the law’s colorful history, rapidly growing body of empirical evidence, and wide range of applications.
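
The law itself is simple to state: the leading significant digit equals d with probability log10(1 + 1/d) for d = 1, 2, ..., 9, so a leading 1 occurs about 30.1% of the time while a leading 9 occurs only about 4.6% of the time.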

An Introduction to Benford’s Law begins with basic facts about significant digits, Benford functions, sequences, and random variables, including tools from the theory of uniform distribution. After introducing the scale-, base-, and sum-invariance characterizations of the law, the book develops the significant-digit properties of both deterministic and stochastic processes, such as iterations of functions, powers of matrices, differential equations, and products, powers, and mixtures of random variables. Two concluding chapters survey the finitely additive theory and the flourishing applications of Benford’s law.
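
As a quick illustration of the deterministic results the book develops (a minimal Python sketch, not code from the book itself), the following tabulates the leading digits of the first 10,000 powers of 2, a standard example of a Benford sequence in base 10, and compares the observed frequencies with the predicted probabilities log10(1 + 1/d):

    import math
    from collections import Counter

    # Benford's first-digit probabilities: P(d) = log10(1 + 1/d), d = 1..9.
    benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

    # Leading significant digits of 2^1, 2^2, ..., 2^N; the sequence (2^n)
    # is a classic example of a sequence that is Benford in base 10.
    N = 10_000
    counts = Counter(int(str(2 ** n)[0]) for n in range(1, N + 1))

    for d in range(1, 10):
        print(f"digit {d}: observed {counts[d] / N:.4f}, predicted {benford[d]:.4f}")

For N = 10,000 the observed frequencies closely match the Benford probabilities, reflecting the equidistribution of n·log10(2) modulo 1.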

Carefully selected diagrams, tables, and close to 150 examples illuminate the main concepts throughout. The text includes many open problems, in addition to dozens of new basic theorems and all the main references. A distinguishing feature is the emphasis on the surprising ubiquity and robustness of the significant-digit law. This text can serve as both a primary reference and a basis for seminars and courses.


About the author

Arno Berger is associate professor of mathematics at the University of Alberta. He is the author of Chaos and Chance: An Introduction to Stochastic Aspects of Dynamics. Theodore P. Hill is professor emeritus of mathematics at the Georgia Institute of Technology and research scholar in residence at California Polytechnic State University.

Additional Information

Publisher
Princeton University Press
Published on
May 26, 2015
Pages
256
ISBN
9781400866588
Language
English
Genres
Mathematics / Applied
Mathematics / General
Mathematics / Logic
Mathematics / Probability & Statistics / General
Content Protection
This content is DRM protected.
Read Aloud
Available on Android devices
Eligible for Family Library

Reading information

Smartphones and Tablets

Install the Google Play Books app for Android and iPad/iPhone. It syncs automatically with your account and allows you to read online or offline wherever you are.

Laptops and Computers

You can read books purchased on Google Play using your computer's web browser.

eReaders and other devices

To read on e-ink devices like the Sony eReader or Barnes & Noble Nook, you'll need to download a file and transfer it to your device. Please follow the detailed Help center instructions to transfer the files to supported eReaders.