There are at least two ways to develop probability theory. The more familiar path is to treat it as a discipline in its own right, working from intuitive examples such as coin flips and conundrums such as the Monty Hall problem. The alternative is to develop measure theory and analysis first and then add the probabilistic interpretation. Bhattacharya and Waymire take the second path. To illustrate the authors' frame of reference, consider the two definitions they give of conditional expectation. The first is as an orthogonal projection in L2 spaces; the authors rely on the reader's familiarity with Hilbert space operators, and at first glance the connection to probability may not be apparent. Only afterward comes a discussion of Bayes's rule and other relevant probabilistic concepts, leading to a definition of conditional expectation as an adjustment of random outcomes from a finer to a coarser information set.
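For readers unfamiliar with the first of these definitions, the standard formulation runs as follows (this is the reviewer's summary of the usual statement, not a quotation from the book):

```latex
% Conditional expectation as an orthogonal projection.
% Let (\Omega, \mathcal{F}, P) be a probability space and
% \mathcal{G} \subseteq \mathcal{F} a sub-\sigma-algebra.
\[
  E[X \mid \mathcal{G}] = \Pi_{\mathcal{G}} X,
  \qquad X \in L^2(\Omega, \mathcal{F}, P),
\]
% where \Pi_{\mathcal{G}} denotes the orthogonal projection of
% L^2(\Omega, \mathcal{F}, P) onto the closed subspace
% L^2(\Omega, \mathcal{G}, P); equivalently, E[X \mid \mathcal{G}]
% is the best mean-square approximation of X by a
% \mathcal{G}-measurable square-integrable random variable:
\[
  E[X \mid \mathcal{G}]
  = \operatorname*{arg\,min}_{Y \in L^2(\Omega, \mathcal{G}, P)}
    E\!\left[(X - Y)^2\right].
\]
```

Read probabilistically, the projection discards exactly the information in F that is not available in the coarser σ-algebra G, which is what connects the Hilbert space definition to the later, more intuitive one.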
Comprising 37 chapters, the volume begins by presenting two remarks related to a result due to Kolmogorov: the first is a theorem holding for nonnegative definite functions from T × T to C (where T is an arbitrary index set), and the second applies to separable Hausdorff spaces T, continuous nonnegative definite functions from T × T to C, and separable Hilbert spaces H. The reader is then introduced to the extremal structure of the range of a controlled vector measure with values in a Hausdorff locally convex space X over the field of reals; to the connections between the theory of vector measures and the theory of compact and weakly compact mappings on certain function spaces; and to Daniell and Daniell-Bochner type integrals. Subsequent chapters focus on the disintegration of measures and lifting; products of spectral measures; and mean convergence of martingales of Pettis-integrable functions.
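The first of the two remarks is presumably a version of the classical representation result for nonnegative definite kernels; one standard formulation (the reviewer's paraphrase, not necessarily the book's exact statement) reads:

```latex
% A Kolmogorov-type representation theorem for nonnegative
% definite kernels on an arbitrary index set T.
% Suppose K : T \times T \to \mathbb{C} is nonnegative definite, i.e.
\[
  \sum_{i,j=1}^{n} c_i \overline{c_j}\, K(t_i, t_j) \ge 0
  \quad \text{for all } n \ge 1,\;
  t_1, \dots, t_n \in T,\;
  c_1, \dots, c_n \in \mathbb{C}.
\]
% Then there exist a Hilbert space H and a map
% \varphi : T \to H such that
\[
  K(s, t) = \langle \varphi(s), \varphi(t) \rangle_{H}
  \quad \text{for all } s, t \in T.
\]
```

The second remark, with its topological hypotheses on T and separability of H, would then refine this representation under continuity assumptions on the kernel.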
This book should be of considerable use to researchers and graduate students in probability theory and the neighboring areas of functional analysis.