A pocketful of theorems makes it plausible to use Gaussian adaptation as a simple second-order statistical model of the evolution of quantitative traits, provided that those traits are Gaussian distributed, or nearly so. The scientific community does not accept this opinion, but nobody has thus far told me that any one of the theorems I refer to is wrong, or that it cannot be applied to evolution. As shown earlier, Gaussian adaptation, GA, may be used for the maximization of manufacturing yield. The biological analogue of technical manufacturing yield is mean fitness. A plausible definition of mean fitness, P, as a mean of probabilities, is

P = integral{ s(x) N(m – x) dx }

where s(x) is the probability that an individual having the array of n quantitative (Gaussian distributed) traits x(i), i = 1, 2, …, n, will survive (be selected as a parent), and N is the Gaussian probability density function, p.d.f., with mean m. This definition may not be very suitable for breeding programs. Nevertheless, it seems very useful in many philosophical discussions.
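
The integral can be estimated numerically as an average of s over phenotypes drawn from the Gaussian. Here is a minimal Monte Carlo sketch, assuming independent traits and a made-up survival function s(x) (both are illustrative assumptions, not part of the definition above):

```python
import random

def mean_fitness(s, m, sigma, n_samples=100_000, seed=1):
    """Monte Carlo estimate of P = integral s(x) N(m - x) dx,
    here with independent traits: N is an isotropic Gaussian p.d.f.
    centred at m with standard deviation sigma per trait."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.gauss(mu, sigma) for mu in m]  # draw a phenotype
        total += s(x)                           # survival probability
    return total / n_samples

# Hypothetical survival function: survive iff all traits are positive.
s = lambda x: 1.0 if all(t > 0 for t in x) else 0.0

# With m = (1, 1) and sigma = 1, each trait is positive with probability
# Phi(1) ~ 0.841, so P should come out near 0.841**2 ~ 0.71.
P = mean_fitness(s, m=[1.0, 1.0], sigma=1.0)
print(round(P, 2))
```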

Together those theorems show a duality between mean fitness and average information (phenotypic disorder, diversity), and that evolution may carry out a simultaneous maximization of mean fitness and average information. This also means that the process gathers more information in the art of survival.

According to point 7 below, there must also be a balance between order and disorder, obtained by a heritable mutation rate such that P is kept at a suitable level. In such a case evolution may maximize average information while keeping mean fitness constant.

1. The central limit theorem: Sums of a large number of random steps tend to become Gaussian distributed.

Since the development from fertilized egg to adult individual may be seen as a modified recapitulation of the stepwise evolution of a particular individual, morphological characters (parameters x) tend to become Gaussian distributed. As examples of such parameters we may mention the length of a bone or the distance between the pupils, or even the IQ.
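
The tendency stated in point 1 is easy to check numerically. The sketch below sums uniform random steps (any step distribution with finite variance would do) and verifies that the sums behave like a Gaussian:

```python
import random

# Sum many small random steps; by the central limit theorem the sums
# tend toward a Gaussian distribution regardless of the step distribution.
rng = random.Random(0)
n_steps, n_sums = 200, 10_000
sums = [sum(rng.uniform(-1, 1) for _ in range(n_steps)) for _ in range(n_sums)]

mean = sum(x for x in sums) / n_sums
var = sum((x - mean) ** 2 for x in sums) / n_sums
# Each uniform(-1, 1) step has mean 0 and variance 1/3, so the sum of
# 200 steps has mean ~0 and variance ~66.7.

sd = var ** 0.5
within_1sd = sum(1 for x in sums if abs(x - mean) < sd) / n_sums
# For a Gaussian, about 68.3% of samples fall within one standard deviation.
print(round(var, 1), round(within_1sd, 3))
```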

2. The Hardy-Weinberg law: If mating takes place at random, then the allele frequencies in the next generation are the same as they were for the parents. Thus, the centre of gravity of the phenotypes of the offspring coincides with the centre of gravity of the phenotypes of the parents.
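
A minimal simulation of random mating at one locus with two alleles illustrates the law (population size and frequency are arbitrary choices for the sketch):

```python
import random

# One locus, two alleles: random mating leaves the allele frequency of
# the offspring generation equal, up to sampling noise, to the parents'.
rng = random.Random(2)
pool = ['A' if rng.random() < 0.3 else 'a' for _ in range(100_000)]  # parental gene pool
p_parents = pool.count('A') / len(pool)

# Each offspring allele is drawn at random from the parental pool.
offspring = [rng.choice(pool) for _ in range(100_000)]
p_offspring = offspring.count('A') / len(offspring)

print(round(p_parents, 3), round(p_offspring, 3))  # nearly equal
```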

3. The definitions of average information and phenotypic disorder (diversity), H, are equivalent and are valid for all statistical frequency functions p(i), i = 1, 2, …, n, with sum{ p(i) } = 1.

H = -sum{ p(i) log[p(i)] }.
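
As a small check of this quantity, the sketch below computes H for two discrete distributions and confirms that the uniform one, being the most disordered, has the larger H (the example distributions are my own choice):

```python
import math

def average_information(p):
    """Shannon's average information H = -sum p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# The uniform distribution is the most disordered discrete distribution:
H_uniform = average_information([0.25] * 4)       # log(4) ~ 1.386
H_skewed = average_information([0.7, 0.1, 0.1, 0.1])
print(H_uniform > H_skewed)  # True
```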

4. The second law of thermodynamics (the entropy law): The disorder will always increase in all isolated systems.

But in order to avoid considering isolated systems I prefer an alternative formulation: A system attains its possible macro states in proportion to their probability of occurrence. Then, the most probable states are the most disordered.

5. A theorem about disorder: The normal distribution is the most disordered distribution among all statistical distributions having the same moment matrix, M.
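
In one dimension the theorem can be illustrated by comparing differential entropies at a common variance; the uniform distribution below is just one example of a competitor with the same moments:

```python
import math

sigma2 = 1.0  # the common variance (the 1-D moment matrix M)

# Differential entropy of a Gaussian with variance sigma2:
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)   # ~1.419

# A uniform distribution with the same variance has support width
# 2*sqrt(3*sigma2), and its entropy is the log of that width:
h_uniform = math.log(2 * math.sqrt(3 * sigma2))           # ~1.242

print(h_gauss > h_uniform)  # True: the Gaussian is more disordered
```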

6. A more general formulation of the theorem of Gaussian adaptation:

a. The gradient of the mean fitness of a normal p. d. f. with respect to m is equal to

grad’m P(m) = P inverse(M) ( m* – m).

The necessary condition for a maximum of mean fitness is m* = m (at selective equilibrium).

m* is the centre of gravity of the phenotypes of the parents.

b. The gradient of phenotypic disorder (entropy, average information, diversity) with respect to m – assuming P constant – points in the same direction as grad’m P(m).

c. A Gaussian p.d.f. may be adapted for maximum phenotypic disorder to any s(x) at any given value of P. The necessary conditions for a maximum are:

m* = m and M* proportional to M.

When m* = m at selective equilibrium, as achieved according to point 2, the gradient = 0 and mean fitness and average information (phenotypic disorder, diversity) may be simultaneously maximal.

For the proof see Kjellström & Taxén, 1981 in reference list

http://en.wikipedia.org/wiki/Gaussian_adapation#references
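
The maximization process behind point 6 can be sketched as an iteration: sample phenotypes from a Gaussian, let selection act, and move m to the centre of gravity m* of the survivors. The sketch below is a deliberately simplified, isotropic version (fixed M = sigma**2 I, and a made-up survival function with a peak at (3, -2)), not the full algorithm:

```python
import math
import random

def gaussian_adaptation_step(s, m, sigma, n_offspring=1000, rng=None):
    """One generation of a simplified isotropic Gaussian adaptation:
    sample phenotypes from N(m, sigma**2 I), let each survive with
    probability s(x), and move m to the centre of gravity m* of the
    survivors.  At selective equilibrium m* = m and the step vanishes."""
    rng = rng or random.Random()
    survivors = []
    for _ in range(n_offspring):
        x = [rng.gauss(mu, sigma) for mu in m]
        if rng.random() < s(x):            # selection
            survivors.append(x)
    if not survivors:
        return m                           # no information this generation
    return [sum(x[i] for x in survivors) / len(survivors)
            for i in range(len(m))]

# Hypothetical survival function: the probability of survival falls off
# with the distance from a fitness peak at (3, -2).
def s(x):
    d2 = (x[0] - 3.0) ** 2 + (x[1] + 2.0) ** 2
    return math.exp(-0.5 * d2)

rng = random.Random(0)
m = [0.0, 0.0]
for _ in range(30):
    m = gaussian_adaptation_step(s, m, sigma=1.0, rng=rng)
print([round(mi, 1) for mi in m])  # m has climbed toward the peak (3, -2)
```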

7. The theorem of efficiency: All measures of efficiency satisfying certain simple relevant postulates are asymptotically proportional to -P*log(P) when the number of statistically independent parameters tends towards infinity.

The most important difference between natural evolution and the simulated evolution in my PC is that the natural one is able to test millions of individuals in parallel, while my PC has to test one at a time. This means that when natural evolution replaces one generation of a population of one million individuals with a new one in one year, the same operation will take one million years in my PC. In spite of this I find the simulated evolution very efficient.

As earlier shown, maximum efficiency is achieved when P = 1/e ≈ 0.37.

For the proof see Kjellström, 1991, in reference list

http://en.wikipedia.org/wiki/Gaussian_adapation#references
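
That the maximum of -P*log(P) sits at P = 1/e is a one-line calculus fact (the derivative -log(P) - 1 vanishes there); a direct numerical scan confirms it:

```python
import math

def efficiency(P):
    # Efficiency measures are asymptotically proportional to -P*log(P).
    return -P * math.log(P)

# Scan yields P in (0, 1) on a fine grid; the maximum falls at P = 1/e.
Ps = [i / 1000 for i in range(1, 1000)]
best = max(Ps, key=efficiency)
print(round(best, 3), round(1 / math.e, 3))  # both ~0.368
```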

Gkm
