Because the Gaussian distribution is the exponential of a squared parameter, the proof of its theorem of adaptation is fairly simple and should be understandable at high-school level. Because the theorem is valid for all Gaussians and all regions of acceptability (even probability functions of acceptance), it suffices in principle to see the proof for a Gaussian with variance = 1 in a single parameter. The proof is easily extended to an arbitrary number of parameters.
Some definitions are necessary:
Let s(x) be the probability that an individual with parameter value x will be selected as a parent of new individuals in the progeny. Further,
N(m – x) = C exp[ – (m – x)^2 / 2 ]
is a Gaussian probability density function of the parameter x. C is a normalizing constant such that integral { N(m – x) dx } = 1, and m is the centre of gravity (mean) of N.
Then
P(m) = integral { s(x) N(m – x) dx }
is the mean of individual fitness (the mean fitness) over the population. Strictly, it is valid only for a population with an infinite number of individuals, but for a population of millions of individuals it is a fairly good approximation.
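As a minimal numerical sketch of the definition, P(m) may be approximated by a Riemann sum on a grid. The selection function s(x) below is an assumed sigmoid, chosen only for illustration; it is not specified in the text.

```python
import numpy as np

grid = np.linspace(-10.0, 10.0, 20001)
dx = grid[1] - grid[0]

def s(x):
    # Hypothetical smooth selection function (an assumption, not from the text):
    # the probability of being selected rises sigmoidally with x.
    return 1.0 / (1.0 + np.exp(-x))

def N(m, x):
    # Gaussian density N(m - x) with variance 1; C = 1/sqrt(2*pi)
    return np.exp(-(m - x) ** 2 / 2) / np.sqrt(2 * np.pi)

def mean_fitness(m):
    # P(m) = integral { s(x) N(m - x) dx }, approximated by a Riemann sum
    return np.sum(s(grid) * N(m, grid)) * dx

print(mean_fitness(0.0))  # ≈ 0.5, since this s is symmetric around x = 0
```

With this particular s, exactly half the probability mass is "accepted" when m = 0, which is a convenient sanity check for the quadrature.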
Let us now find out under what circumstances P becomes maximal. To that end we calculate the derivative, dP(m)/dm, of P with respect to m and set it equal to zero. Because we may differentiate inside the integral, and because differentiating the Gaussian exponential brings down the factor – (m – x), we get
dP(m)/dm = – integral { (m – x) s(x) N(m – x) dx } =
= integral { x s(x) N(m – x) dx } – m integral { s(x) N(m – x) dx } =
= integral { x s(x) N(m – x) dx } – m P =
= P [ integral { x s(x) N(m – x) dx } / P – m ] =
= P ( m* – m ) = 0,
where we have introduced m*, the centre of gravity of the set of selected parents:
m* = integral { x s(x) N(m – x) dx } / integral { s(x) N(m – x) dx } =
= integral { x s(x) N(m – x) dx } / P.
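The identity dP/dm = P ( m* – m ) can be checked numerically. The sketch below reuses an assumed sigmoid s(x) (an illustration, not from the text) and compares a finite-difference derivative of P with the right-hand side.

```python
import numpy as np

grid = np.linspace(-10.0, 10.0, 20001)
dx = grid[1] - grid[0]

def s(x):
    # assumed sigmoid selection probability (illustrative, not from the text)
    return 1.0 / (1.0 + np.exp(-x))

def N(m, x):
    # Gaussian density N(m - x) with variance 1
    return np.exp(-(m - x) ** 2 / 2) / np.sqrt(2 * np.pi)

def P(m):
    # mean fitness: integral { s(x) N(m - x) dx }
    return np.sum(s(grid) * N(m, grid)) * dx

def m_star(m):
    # centre of gravity of the selected parents
    return np.sum(grid * s(grid) * N(m, grid)) * dx / P(m)

m, h = 0.7, 1e-5
dP = (P(m + h) - P(m - h)) / (2 * h)     # central-difference derivative of P
print(abs(dP - P(m) * (m_star(m) - m)))  # ≈ 0: dP/dm = P (m* - m)
```

The residual is tiny for any m tried, as the derivation predicts; only the choice of s(x) and the test point m = 0.7 are arbitrary here.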
The proof is fairly easily extended to an arbitrary number of parameters, each with variance = 1.
As long as P > 0 (otherwise the population is extinct), a necessary condition for P to be maximal is that m* becomes equal to m in a state of selective equilibrium. The Hardy-Weinberg law (see the previous blog post) may push evolution in this direction by making m equal to m* in every generation (generations being non-overlapping).
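The generation-by-generation rule "set the new m to the m* of the selected parents" can be simulated. The bell-shaped selection function below is a hypothetical choice (not from the text) whose acceptance peaks at x = 2, so the population mean should settle there.

```python
import numpy as np

grid = np.linspace(-10.0, 14.0, 24001)
dx = grid[1] - grid[0]

def s(x):
    # hypothetical bell-shaped selection: acceptance probability peaks at x = 2
    return np.exp(-(x - 2.0) ** 2 / 2)

def N(m, x):
    # Gaussian density N(m - x) with variance 1
    return np.exp(-(m - x) ** 2 / 2) / np.sqrt(2 * np.pi)

m = -3.0                            # initial centre of gravity of the population
for generation in range(30):
    w = s(grid) * N(m, grid)        # density of selected parents
    P = np.sum(w) * dx              # mean fitness this generation
    m = np.sum(grid * w) * dx / P   # m := m*, one non-overlapping generation
print(round(m, 3))  # → 2.0, the position where P is maximal
```

With these two Gaussians the update is exactly m := (m + 2) / 2, so m halves its distance to the peak every generation, which is why 30 generations are more than enough.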
Suppose now that m is in a position making P maximal. If m is moved slightly in an arbitrary direction, P will decrease, but the loss may be recovered by slightly decreasing the variance, which is the same as decreasing the disorder of N. Therefore we may say that the mean fitness and the disorder are simultaneously maximal when m* = m.
Unfortunately, the concept of disorder has an unpleasant ring for most people. But observing that the concept is synonymous with average information and biological diversity, the ring becomes more pleasant. Disorder becomes even more attractive when it turns out that it increases the probability of finding higher peaks in the phenotypic landscape. Thus, disorder is of crucial importance to survival. It also means diversity within the same species: individuals will differ more from each other. Unfortunately, not everyone is delighted with this diversity, which may also give rise to different forms of racism.