Rough Beast

Grifo Mecanico - Diego Mazzeo

Monday, June 11, 2012

Entropy and Evolution

From the always seductively intelligent Sean Carroll at Discover Magazine:

"Okay, sticking to my desire to blog rather than just tweet (we’ll see how it goes): here’s a great post by John Baez with the forbidding title “Information Geometry, Part 11.” But if you can stomach a few equations, there’s a great idea being explicated, which connects evolutionary biology to entropy and information theory.
There are really two points. The first is a bit of technical background you can ignore if you like, and skip to the next paragraph. It’s the idea of “relative entropy” and its equivalent “information” formulation. Information can be thought of as “minus the entropy,” or even better “the maximum entropy possible minus the actual entropy.” If you know that a system is in a low-entropy state, it’s in one of just a few possible microstates, so you know a lot about it. If it’s high-entropy, there are many states that look that way, so you don’t have much information about it. (Aside to experts: I’m kind of shamelessly mixing Boltzmann entropy and Gibbs entropy, but in this case it’s okay, and if you’re an expert you understand this anyway.) John explains that the information (and therefore also the entropy) of some probability distribution is always relative to some other probability distribution, even if we often hide that fact by taking the fiducial probability to be uniform (… in some variable). The relative information between two distributions can be thought of as how much you don’t know about one distribution if you know the other one; the relative information between a distribution and itself is zero.
The second point has to do with the evolution of populations in biology (or in analogous fields where we study the evolution of populations), following some ideas of John Maynard Smith. Make the natural assumption that the rate of change of a population is proportional to the number of organisms in that population, where the “constant” of proportionality is a function of all the other populations. That is: imagine that every member of the population breeds at some rate that depends on circumstances. Then there is something called an evolutionarily stable state, one in which the relative populations (the fraction of the total number of organisms in each species) are constant. An equilibrium configuration, we might say.
Then the take-home synthesis is this: if you are not in an evolutionarily stable state, then as your population evolves, the relative information between the actual state and the stable one decreases with time. Since information is minus entropy, this is a Second-Law-like behavior. But the interpretation is that the population is “learning” more and more about the stable state, until it achieves that state and knows all there is to know!
Okay, you can see why tweeting is seductive. Without the 140-character limit, it’s hard to stop typing, even if I try to just link and give a very terse explanation. Hopefully I managed to get all the various increasing/decreasing pointing in the right direction…"
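On Carroll's first point, a concrete anchor may help. Below is a minimal sketch in Python of relative entropy between two discrete distributions; the function name, the natural-log convention, and the example numbers are my own choices, not anything from Carroll's or Baez's posts.

import numpy as np

def relative_entropy(p, q):
    """Relative entropy D(p || q) in nats: roughly, how much you learn
    on discovering the true distribution is p when you expected q.
    Zero exactly when the two distributions coincide."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                 # convention: terms with p_i == 0 contribute 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # ~0.511: the prior was off
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))   # 0.0: a distribution
                                                  # relative to itself

Note the asymmetry: D(p || q) and D(q || p) generally differ, which is why the quoted passage is careful to say that the information of one distribution is always relative to another.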
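The second point also has a compact standard form. In the usual replicator-equation notation (my symbols, not the quoted text's), each population P_i obeys

\frac{dP_i}{dt} = f_i(P_1,\dots,P_n)\,P_i,
\qquad
p_i = \frac{P_i}{\sum_j P_j},

where the fractions p_i are the relative populations, and an evolutionarily stable state is a distribution q of fractions at which every dp_i/dt vanishes, so the relative populations stay constant.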
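Finally, the take-home synthesis can be watched numerically. The toy simulation below is entirely my construction, not from either post: I pick the fitness f_i = -p_i, which penalizes common types, so the uniform mixture q is the stable state, and I track the relative information D(q || p(t)) (stable state as the first argument, the convention under which the monotone decrease is usually stated) as the population "learns" its way to q.

import numpy as np

def relative_entropy(p, q):
    """D(p || q) in nats; terms with p_i == 0 contribute nothing."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

q = np.array([1/3, 1/3, 1/3])    # the evolutionarily stable state
p = np.array([0.7, 0.2, 0.1])    # initial relative populations

dt = 0.01
for step in range(2001):
    if step % 400 == 0:
        print(f"t = {step*dt:5.2f}   D(q||p) = {relative_entropy(q, p):.6f}")
    f = -p                            # fitness of each type
    p = p + dt * p * (f - p @ f)      # replicator step for the fractions
    p = p / p.sum()                   # re-normalize to absorb Euler error

Each printed value is smaller than the last, mirroring the Second-Law-like statement above: the gap shrinks until the population "knows all there is to know" about the stable state.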
