Ten Great Ideas about Chance
Key Metrics
- Persi Diaconis
- Brian Skyrms
- Princeton University Press
- Hardcover
- 9780691174167
- 9.4 × 6.5 × 1.1 inches
- 1.35 pounds
- Mathematics > Probability & Statistics - General
- English
Book Description
A fascinating account of the breakthrough ideas that transformed probability and statistics
In the sixteenth and seventeenth centuries, gamblers and mathematicians transformed the idea of chance from a mystery into the discipline of probability, setting the stage for a series of breakthroughs that enabled or transformed innumerable fields, from gambling, mathematics, statistics, economics, and finance to physics and computer science. This book tells the story of ten great ideas about chance and the thinkers who developed them, tracing the philosophical implications of these ideas as well as their mathematical impact.
Persi Diaconis and Brian Skyrms begin with Gerolamo Cardano, a sixteenth-century physician, mathematician, and professional gambler who helped develop the idea that chance actually can be measured. They describe how later thinkers showed how the judgment of chance also can be measured, how frequency is related to chance, and how chance, judgment, and frequency could be unified. Diaconis and Skyrms explain how Thomas Bayes laid the foundation of modern statistics, and they explore David Hume's problem of induction, Andrey Kolmogorov's general mathematical framework for probability, the application of computability to chance, and why chance is essential to modern physics. A final idea--that we are psychologically predisposed to error when judging chance--is taken up through the work of Daniel Kahneman and Amos Tversky.
Complete with a brief probability refresher, Ten Great Ideas about Chance is certain to be a hit with anyone who wants to understand the secrets of probability and how they were discovered.
Author Bio
I am a mathematician and statistician working in probability, combinatorics, and group theory, with a focus on applications to statistics and scientific computing. A specialty is rates of convergence of Markov chains. I am currently interested in adapting these mathematical developments to say something useful to practitioners running large real-world simulations.
Research Interests
Many of my publications during the past three years have focused on rates of convergence of Markov chains to their stationary distributions. This is an important part of applied probability and scientific computing. I am particularly pleased by (a) results with Phillip Wood showing that most birth and death chains on {0,1,...} do not show a sharp cut-off; (b) results with Jason Fulman and Susan Holmes giving a careful analysis of casino shuffling machines; (c) results with Sourav Chatterjee making new contributions to a practical physics problem, Bose-Einstein condensation.
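The convergence-rate question above can be made concrete with a small example. The following sketch is my own illustration, not code from any of the papers mentioned: it tracks the total variation distance between a lazy random walk on a 5-cycle and its uniform stationary distribution. The specific chain, its size, and the number of steps are all assumptions chosen for brevity.

```python
# Toy illustration: how fast does a small Markov chain approach its
# stationary distribution?  This is the quantity that rates-of-convergence
# results control.

def step(dist, P):
    """One step of the chain: push the distribution through transition matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def tv_distance(p, q):
    """Total variation distance between two probability distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Lazy random walk on a 5-cycle: stay put with probability 1/2,
# otherwise move to a uniformly chosen neighbor.
n = 5
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    P[i][i] = 0.5
    P[i][(i - 1) % n] += 0.25
    P[i][(i + 1) % n] += 0.25

uniform = [1.0 / n] * n          # the stationary distribution of this walk
dist = [1.0] + [0.0] * (n - 1)   # start deterministically at state 0

for t in range(40):
    dist = step(dist, P)

print(tv_distance(dist, uniform))  # tiny after 40 steps
```

The distance decays geometrically at a rate governed by the chain's spectral gap; sharp convergence-rate theorems pin down exactly how many steps are needed.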
I think I opened up two new areas in the study of Markov chains: rates of convergence to quasi-stationarity and the study of "features". For the first, a host of absorbing Markov chains appear in genetics, biology, and queuing. These have quasi-stationary distributions: given that the chain has not been absorbed by time T, where is it likely to be? We ask for quantitative versions: how large does T have to be for the asymptotics to be useful? The many tools available for ergodic chains need to be completely revised, and my papers make a start at revising the geometric theory (Poincaré, Cheeger, Nash, log-Sobolev). For the second area, "features", researchers often don't care about all aspects of a chain but are interested in only a few features; the rates of convergence can then change. I proved such results for riffle shuffling and now see how to extend the approach to other chains.
In addition to my work on Markov chains, I have completed a number of statistical projects. One of the main ones concerns the statistical analysis of graph and network data. Working with Sourav Chatterjee, Svante Janson, and Susan Holmes, we built a theory that allows analysis of familiar exponential models. These can have surprising properties: sometimes N parameters can be accurately estimated from a sample of size one, and sometimes a large amount of data can still lead to inconsistent estimators. This work opened up the connection between statistics and the emerging area of graph limit theory. In turn, that connection has led to a torrent of follow-up work, conferences, and real-world applications.
One key topic in my work has been generalizations of de Finetti's notion of exchangeability. The graph work above leans on the connections I made between graph limit theory and the Aldous-Hoover theorem. In a different direction, working with Sergio Bacallado and Susan Holmes, I managed to develop a practical theory of "almost exchangeability" and apply it to some biological problems. I am busy following this up with more theoretical work on de Finetti-style representation theorems for approximately exchangeable data.
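De Finetti's notion of exchangeability can be sketched in a few lines. In this illustration of mine (not the authors' code), a random coin bias is drawn once and then flipped independently; by de Finetti's theorem, every exchangeable binary sequence is such a mixture. Checking that P(H, T) ≈ P(T, H) exhibits the defining order-invariance.

```python
# Toy illustration of de Finetti exchangeability: a mixture of iid coins.

import random

def exchangeable_flips(n, rng):
    """Draw a latent bias p ~ Uniform(0, 1), then flip n iid coins with bias p."""
    p = rng.random()
    return tuple(rng.random() < p for _ in range(n))

rng = random.Random(1)
samples = [exchangeable_flips(2, rng) for _ in range(200_000)]

# Exchangeability means the order carries no information: P(H, T) = P(T, H).
ht = sum(s == (True, False) for s in samples) / len(samples)
th = sum(s == (False, True) for s in samples) / len(samples)
print(ht, th)  # both near 1/6 for the uniform mixing measure
```

"Almost exchangeability" relaxes this exact order-invariance, asking for representation theorems when the symmetry holds only approximately.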
A very different statistical development is my work with Bob Griffiths developing bivariate distributions for data with binomial or multinomial models, and with Bailey et al., which combines my group-theoretic methods for the analysis of designed experiments with the more classical approaches of Bailey and Nelder's "general balance". It is surprising that these important problems haven't been seriously treated to date. Finally, work with my students Bhaswar Bhattacharya and Sumit Mukherjee on generalizations of the birthday problem to random graphs has direct application to Friedman-Rafsky two-sample tests.
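One way to read the birthday-problem generalization mentioned above (my framing, an assumption rather than the paper's): give each vertex of a graph a uniform random "birthday" and ask for the probability that some edge has matching endpoints. The classical problem is recovered when the graph is complete.

```python
# Toy illustration: the birthday problem as a coincidence on the edges of a
# graph.  The complete graph on 23 people gives the classical answer of
# just over 1/2.

import itertools
import random

def match_probability(edges, n_vertices, n_days=365, trials=20_000, seed=0):
    """Estimate P(some edge has matching endpoint birthdays) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        birthday = [rng.randrange(n_days) for _ in range(n_vertices)]
        if any(birthday[u] == birthday[v] for u, v in edges):
            hits += 1
    return hits / trials

n = 23
complete = list(itertools.combinations(range(n), 2))
prob = match_probability(complete, n)
print(prob)  # close to the exact value of about 0.507
```

Replacing the complete graph with a sparser or random graph changes the coincidence probability, which is the kind of generalization the text refers to.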
Source: Stanford University