The Bactra Review: Occasional and eclectic book reviews by Cosma Shalizi   151

# Stochastic Processes for Physicists

## by Kurt Jacobs

Cambridge University Press, 2010

#### Aimed at the Right Target

A slightly different version of this review ran in *Physics Today*, 63:12 (December 2010): 66.

Some years ago, the great mathematical physicist R. F. Streater began a review paper on "Classical and Quantum Probability" by complaining "There are few mathematical topics that are as badly taught to physicists as probability theory. Maxwell, Boltzmann and Gibbs were using probabilistic methods long before the subject was properly established as mathematics," but nonetheless it is their language and mathematical apparatus that are still taught and used. This is unfortunate, considering the great utility of stochastic models of physical systems, and, at a more purely theoretical level, the very deep and sometimes surprising connections among dynamical systems, statistical mechanics, and the asymptotic "laws of large numbers" and "large deviations principles" governing random phenomena. Improvements to this part of physics pedagogy are therefore welcome.

This is where Jacobs's book comes in. He deals with stochastic processes which come from adding randomly-varying noise terms into equations of motion, especially differential equations. Suppose, for example, that we're modeling the response of the bulk magnetization in a magnet after an external magnetic field has been shut off. There will be a deterministic component to the relaxation, reasonably modeled with an ordinary differential equation. But the actual magnetization of the material will be a sum over the magnetic fields of a large number of individual atoms, sharing energy thermally with each other and with the environment, and so we expect fluctuations around the deterministic relaxation. Assuming the magnet is far from the critical point, correlations in fluctuations should be short-range, so distant (on a molecular length-scale) parts of the magnet should be nearly de-correlated, and then the central limit theorem leads us to expect Gaussian fluctuations; one could tell a similar story based on the Einstein formula from statistical mechanics. Those fluctuations are not just measurement errors on our part --- the material really is becoming more or less magnetized. To model the dynamics, then, we need to somehow add terms representing Gaussian fluctuations into differential equations. Just as we solve ordinary differential equations by integration, we need to solve these stochastic differential equations by doing stochastic integrals.

The first person to completely make sense of stochastic integrals, at least with Gaussian white noise as the driving process, was the probabilist Kiyoshi Ito, and the rules he worked out for manipulating them --- not quite the same as those for ordinary integrals --- are called "Ito calculus". After two chapters of set-up, the core of Jacobs's book (chs. 3--6) introduces the Ito calculus and stochastic differential equations, discusses some properties of their solutions, and looks at applications and numerical methods.
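To make the numerical side concrete (this sketch is mine, not from the book): if we write the magnetization example as the linear stochastic differential equation dM = -γM dt + σ dW, with a relaxation rate γ and noise strength σ of my own choosing, the simplest scheme discussed in texts on numerical SDEs, Euler-Maruyama, steps the equation forward by drawing each Wiener increment as a Gaussian of variance dt:

```python
import numpy as np

def euler_maruyama(gamma, sigma, m0, dt, n_steps, rng):
    """Simulate dM = -gamma*M dt + sigma dW (an Ornstein-Uhlenbeck
    process) with the Euler-Maruyama scheme.  The Wiener increment
    over a step of length dt is Gaussian with variance dt, hence
    the sqrt(dt) factor below."""
    m = np.empty(n_steps + 1)
    m[0] = m0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))            # Wiener increment
        m[i + 1] = m[i] - gamma * m[i] * dt + sigma * dW
    return m

rng = np.random.default_rng(0)
path = euler_maruyama(gamma=1.0, sigma=0.1, m0=1.0,
                      dt=1e-3, n_steps=5000, rng=rng)
# The noiseless part decays like exp(-gamma*t); the simulated
# path fluctuates around that deterministic relaxation.
```

The sqrt(dt) scaling of the noise term, rather than dt, is exactly where the Ito calculus departs from the ordinary kind.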

Stochastic differential equations have as their solutions entire trajectories, just like ordinary differential equations --- in the example, "paths" of the magnetization as a function of time. If we were interested only in the probability of a certain value of magnetization as a function of time, and not in how the magnet got into that state, we could replace the stochastic differential equation with a linear partial differential equation for the probability distribution, the "Fokker-Planck equation"; this is the subject of chapter 7. The chapter also includes an excursion into deterministic models of pattern formation; stochastic partial differential equations are not touched on.
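In the magnetization example, if one writes the dynamics as dM = -γM dt + σ dW (my notation, not the book's), the corresponding Fokker-Planck equation for the probability density p(m, t) is the standard

```latex
\frac{\partial p}{\partial t}
  = \gamma \frac{\partial}{\partial m}\bigl(m\,p\bigr)
  + \frac{\sigma^{2}}{2}\,\frac{\partial^{2} p}{\partial m^{2}},
```

whose stationary solution is a Gaussian of variance σ²/(2γ), matching the picture of equilibrium fluctuations around the fully relaxed state.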

Gaussian white noise is not the only possible driving noise, and the Ito calculus is a special case of a more general family of stochastic calculi that invoke different driving noises. An important case is when the noise is not applied continuously but at discrete, random points in time (Poisson noise; chapter 8), or has non-Gaussian characteristics (Lévy noise; chapter 9).

There are great merits to all these chapters. Jacobs is an enthusiastic, clear, and concise writer. He is not afraid to tackle the whole field of stochastic differential equations as a fairly unified whole, which it is, rather than a collection of special cases. All theory is presented by means of heuristic arguments and calculations. The examples, while not treated in depth because of space, are pedagogically valuable.

This brings me to the weaknesses. Every process Jacobs discusses is a "Markov" process: its past is irrelevant to its future, given its current state. (Oddly, he never uses this term.) There is a powerful theory of the asymptotic behavior of Markov processes, going back to Markov himself, and it would have been nice to include some of that, and perhaps more general ergodic theory. It would also have been good to see some mention of non-Markovian processes. (The most natural way to get something non-Markovian is to take a Markov process but measure its state imperfectly --- observational noise or just limited resolution will do. Past observations usually are relevant in such "hidden" Markov models, because they help estimate the true current state.) Arguably, this is just a matter of taste.

More serious are mis-statements about some basic points of probability theory. For instance, Jacobs repeatedly says that the sum of two random variables has a distribution which is the convolution of their distributions; this is only true for independent variables. When he says this, and uses the result, the variables he's adding happen to be independent, but it would have been trivial to get it right. Similarly, the remarks on p. 30 in chapter 3 on how "for the purposes of obtaining a stochastic differential equation that describes real systems driven by noise" we are "forced" (his italics) to use Gaussian noise are contradicted by the chapters on Poisson and Lévy noise!
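The independence point is easy to check numerically (my example, not the book's): the convolution rule implies that the variance of a sum is the sum of the variances, which fails as soon as the variables are dependent:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)          # standard normal, variance 1

# Independent copies: Var(X + Y) = Var(X) + Var(Y) = 2, consistent
# with the density of the sum being the convolution of the marginals.
y_indep = rng.normal(size=100_000)
print(np.var(x + y_indep))            # close to 2

# Perfectly dependent case Y = X: Var(X + Y) = Var(2X) = 4, while
# convolving the two marginals would wrongly predict variance 2.
print(np.var(x + x))                  # close to 4
```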

There is finally a strange animus against probability theory, as used by mathematicians, statisticians, economists, and control engineers. Chapter 10 introduces the jargon of modern probability theory, based on measure theory, but in a spirit of prophylaxis against infection rather than providing information. (This is in striking contrast to, say, Lasota and Mackey's excellent *Chaos, Fractals, and Noise: Stochastic Aspects of Dynamics*.) The "modern" approach was introduced about eighty years ago by figures like Norbert Wiener, Andrei Kolmogorov and Paul Lévy, precisely to handle physically motivated problems, like continuous fluctuations, in a coherent way. Mathematicians and statisticians swiftly abandoned the kind of probability theory still taught in physics courses because the new variety is much more powerful and clear, and the actual math is not that hard. (Having both done a Ph.D. in theoretical physics, and learned, used and taught measure-theoretic probability, I am quite sure about this.) There are plenty of books on group theory (say) for physicists, and quite properly so --- they have different needs and interests than mathematicians --- but none of the ones I've seen end by saying, in effect, "What the algebraists have to say about groups is useless, so don't bother with their ideas."

To sum up, this book has real merits, but I cannot recommend it unreservedly for self-study. It could be very useful with a teacher who can correct some of its odd swerves, and use it as a solid part of a class on stochastic models in physics.

188 pp., diagrams, a few black and white photos, index, bibliography

Hardback, ISBN 978-0-521-76542-8

15 December 2010