October 04, 2005

Schooled by Selection

Attention conservation notice: Promotes a technical preprint by some friends. I make no attempt to explain the science, owing to lack of time.

As you know, Bob, in the 1950s J. L. Kelly established that there were intimate connections between optimal strategies for repeated gambling and Shannon's information theory. (For instance, the best achievable growth rate for the gambler's wealth is set by the entropy rate of the random sequence of gambling outcomes.) As you also know, Bob, the mathematical theory of natural selection is closely connected to that of repeated gambling (so that, e.g., John Holland's Adaptation in Natural and Artificial Systems is in some ways an extended treatise on multi-armed bandits.) This suggests that information theory could be useful in analyzing natural selection, and it would be natural to suppose that information about the environment should manifest itself as increased fitness somehow. There's been sporadic interest in the topic (e.g., J. B. S. Haldane, with his usual prescience, had an early paper in this area), but really, in my humble opinion, not enough. By way of rectification, I submit the following for your favorable consideration:
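(For the curious, Kelly's growth-rate result is easy to check in the simplest case. The following is my own minimal sketch, not drawn from any of the papers below: a gambler bets a fraction f of wealth at even odds on an i.i.d. binary outcome with win probability p; the optimal fraction is f* = 2p - 1, and the resulting growth rate of log-wealth is log 2 minus the entropy of the outcome.)

```python
import math

def expected_log_growth(p, f):
    """Expected log-growth of wealth per bet when wagering a fraction f
    of current wealth at even odds on an event with win probability p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.6
f_star = 2 * p - 1  # the Kelly fraction for even-odds bets

# Entropy of the outcome, in nats
h = -(p * math.log(p) + (1 - p) * math.log(1 - p))

# Kelly's identity: the optimal growth rate equals log 2 - H(p),
# so a more predictable (lower-entropy) outcome means faster growth.
print(expected_log_growth(p, f_star))  # approx. 0.0201
print(math.log(2) - h)                 # the same value
```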

Carl T. Bergstrom and Michael Lachmann, "The fitness value of information", q-bio.PE/0510007, submitted to PNAS
Abstract: Biologists measure information in different ways. Neurobiologists and researchers in bioinformatics often measure information using information-theoretic measures such as Shannon's entropy or mutual information. Behavioral biologists and evolutionary ecologists more commonly use decision-theoretic measures, such as the value of information, which assess the worth of information to a decision maker. Here we show that these two kinds of measures are intimately related in the context of biological evolution. We present a simple model of evolution in an uncertain environment, and calculate the increase in Darwinian fitness that is made possible by information about the environmental state. This fitness increase -- the fitness value of information -- is a composite of both Shannon's mutual information and the decision-theoretic value of information. Furthermore, we show that in certain cases the fitness value of responding to a cue is exactly equal to the mutual information between the cue and the environment. In general the Shannon entropy of the environment, which seemingly fails to take anything about organismal fitness into account, nonetheless imposes an upper bound on the fitness value of information.
Edo Kussell and Stanislas Leibler, "Phenotypic Diversity, Population Growth, and Information in Fluctuating Environments", Science 309 (2005): 2075--2078
Abstract: Organisms in fluctuating environments must constantly adapt their behavior to survive. In clonal populations, this may be achieved through sensing followed by response or through the generation of diversity by stochastic phenotype switching. Here we show that stochastic switching can be favored over sensing when the environment changes infrequently. The optimal switching rates then mimic the statistics of environmental changes. We derive a relation between the long-term growth rate of the organism and the information available about its fluctuating environment.

Kussell and Leibler consider Markovian environments (technically, the environmental state is a semi-Markov process), and show that the fitness penalty paid for getting the statistics of environmental changes wrong is proportional to the relative entropy (Kullback divergence) rate between the organism's switch rates and the environment's. Bergstrom and Lachmann consider only independent, identically-distributed environments, but go much further in relating the fitness value of signals about the environment to traditional information-theoretic quantities, essentially considering those signals as transmission channels. (They like thinking about the value of signals.) In both cases, my feeling is that, since the Kelly gambling results carry over to general ergodic environments (see the papers of Thomas Cover, especially the ones with Paul Algoet), the evolutionary results should too. I am not, however, volunteering to perform the extensions.
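(The relative-entropy penalty shows up already in a toy i.i.d. version of the problem, a drastic simplification of Kussell and Leibler's semi-Markov setting; the sketch below is mine, not theirs. Suppose a lineage commits a fraction q[i] of each generation to the phenotype matched to environmental state i, state i actually occurs with probability p[i], and only matched phenotypes reproduce. Then the long-run growth-rate cost of believing q instead of p is exactly the Kullback divergence D(p||q).)

```python
import math

def log_growth_rate(p, q, fecundity=2.0):
    """Long-run log growth rate of a bet-hedging lineage that puts a
    fraction q[i] of offspring into the phenotype matched to state i,
    when state i occurs with probability p[i] and only the matched
    phenotype survives, leaving `fecundity` offspring."""
    return sum(pi * math.log(fecundity * qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback divergence D(p||q), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]  # true environment statistics (illustrative numbers)
q = [0.5, 0.3, 0.2]  # the lineage's (mistaken) switching proportions

# The fitness penalty for getting the statistics wrong:
penalty = log_growth_rate(p, p) - log_growth_rate(p, q)
print(penalty)                # equals...
print(kl_divergence(p, q))    # ...the relative entropy D(p||q)
```

The optimal strategy matches proportions to probabilities (q = p), which is the "proportional betting" of the Kelly literature in evolutionary dress.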

I happen to know that Bergstrom and Lachmann's work is part of a more general program investigating the role of information in evolution, because I've been bugging Michael to publish his results since I heard him talk about them at the first "Science et Gastronomie" workshop two years ago. I won't say any more, for fear of spoiling their surprises, except to say that further exciting revelations are close at hand.

Biology; Enigmas of Chance

Posted at October 04, 2005 21:34 | permanent link

Three-Toed Sloth