1. Introduction
In 1930, Fisher [1] stated his “fundamental theorem of natural selection” as follows:
The rate of increase in fitness of any organism at any time is equal to its genetic variance in fitness at that time.
Some tried to make this statement precise as follows:
The time derivative of the mean fitness of a population equals the variance of its fitness.
However, this is only true under very restrictive conditions, so a controversy was ignited.
An interesting resolution was proposed by Price [2], and later amplified by Ewens [3] and Edwards [4]. We can formalize their idea as follows. Suppose we have $n$ types of self-replicating entity, and idealize the population of the $i$th type as a positive real-valued function $P_i(t)$. Suppose
$$ \frac{dP_i}{dt}(t) = f_i(P_1(t), \dots, P_n(t)) \, P_i(t) $$
where the fitness $f_i$ is a differentiable function of the populations of every type of replicator. The mean fitness at time $t$ is
$$ \overline{f}(t) = \sum_{i=1}^n p_i(t) \, f_i(P_1(t), \dots, P_n(t)) $$
where $p_i(t)$ is the fraction of replicators of the $i$th type:
$$ p_i(t) = \frac{P_i(t)}{\sum_{j=1}^n P_j(t)}. $$
By the product rule, the rate of change of the mean fitness is the sum of two terms:
$$ \frac{d}{dt} \overline{f}(t) = \sum_{i=1}^n \dot{p}_i(t) \, f_i(P_1(t), \dots, P_n(t)) \; + \; \sum_{i=1}^n p_i(t) \, \frac{d}{dt} f_i(P_1(t), \dots, P_n(t)). $$
The first of these two terms equals the variance of the fitness at time $t$. We give the easy proof in Theorem 1. Unfortunately, the conceptual significance of this first term is much less clear than that of the total rate of change of mean fitness. Ewens concluded that “the theorem does not provide the substantial biological statement that Fisher claimed”.
However, there is another way out, based on an idea Fisher himself introduced in 1922: Fisher information [5]. Fisher information gives rise to a Riemannian metric on the space of probability distributions on a finite set, called the ‘Fisher information metric’—or in the context of evolutionary game theory, the ‘Shahshahani metric’ [6,7,8]. Using this metric we can define the speed at which a time-dependent probability distribution changes with time. We call this its ‘Fisher speed’. Under just the assumptions already stated, we prove in Theorem 2 that the square of the Fisher speed of the probability distribution $p(t) = (p_1(t), \dots, p_n(t))$ is the variance of the fitness at time $t$.
As explained by Harper [9,10], natural selection can be thought of as a learning process, and studied using ideas from information geometry [11]—that is, the geometry of the space of probability distributions. As $p(t)$ changes with time, the rate at which information is updated is closely connected to its Fisher speed. Thus, our revised version of the fundamental theorem of natural selection can be loosely stated as follows:
As a population changes with time, the rate at which information is updated equals the variance of fitness.
The precise statement, with all the hypotheses, is in Theorem 2. However, one lesson is this: variance in fitness may not cause ‘progress’ in the sense of increased mean fitness, but it does cause change.
2. The Time Derivative of Mean Fitness
Suppose we have $n$ different types of entity, which we call replicators. Let $P_i(t)$, or $P_i$ for short, be the population of the $i$th type of replicator at time $t$, which we idealize as taking positive real values. Then a very general form of the Lotka–Volterra equations says that
$$ \frac{dP_i}{dt} = f_i(P_1, \dots, P_n) \, P_i \qquad (1) $$
where $f_i \colon (0,\infty)^n \to \mathbb{R}$ is the fitness function of the $i$th type of replicator. One might also consider fitness functions with explicit time dependence, but we do not do so here.
Let $p_i(t)$, or $p_i$ for short, be the probability at time $t$ that a randomly chosen replicator will be of the $i$th type. More precisely, this is the fraction of replicators of the $i$th type:
$$ p_i = \frac{P_i}{\sum_{j=1}^n P_j} \qquad (2) $$
Using these probabilities, we can define the mean fitness $\overline{f}$ by
$$ \overline{f} = \sum_{i=1}^n p_i \, f_i(P_1, \dots, P_n) \qquad (3) $$
and the variance in fitness by
$$ \mathrm{Var}(f) = \sum_{i=1}^n p_i \left( f_i(P_1, \dots, P_n) - \overline{f} \right)^2 \qquad (4) $$
These quantities are also functions of $t$, but we suppress the $t$ dependence in our notation.
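As a quick numerical illustration of Equations (2)–(4), here is a minimal sketch in Python; the population sizes and fitness values are arbitrary choices made for the example, not taken from any model in the text.

```python
import numpy as np

# Arbitrary example data: populations P_i and fitness values f_i(P_1, ..., P_n).
P = np.array([120.0, 75.0, 5.0])   # populations of three replicator types
f = np.array([0.3, 0.1, -0.2])     # fitness of each type at these populations

p = P / P.sum()                                  # Equation (2): fraction of each type
mean_fitness = np.sum(p * f)                     # Equation (3): mean fitness
variance = np.sum(p * (f - mean_fitness)**2)     # Equation (4): variance in fitness

print(mean_fitness, variance)
```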
Fisher said that the variance in fitness equals the rate of change of mean fitness. Price [2], Ewens [3] and Edwards [4] argued that Fisher only meant to equate part of the rate of change in mean fitness to the variance in fitness. We can see this in the present context as follows. The time derivative of the mean fitness is the sum of two terms:
$$ \frac{d}{dt} \overline{f} = \sum_{i=1}^n \dot{p}_i \, f_i(P_1, \dots, P_n) \; + \; \sum_{i=1}^n p_i \, \frac{d}{dt} f_i(P_1(t), \dots, P_n(t)) \qquad (5) $$
and as we now show, the first term equals the variance in fitness.
Theorem 1. Suppose positive real-valued functions $P_i(t)$ obey the Lotka–Volterra equations for some continuous functions $f_i \colon (0,\infty)^n \to \mathbb{R}$. Then
$$ \sum_{i=1}^n \dot{p}_i \, f_i(P_1, \dots, P_n) = \mathrm{Var}(f). $$
Proof. First we recall a standard formula for the time derivative $\dot{p}_i$. Using the definition of $p_i$ in Equation (2), the quotient rule gives
$$ \dot{p}_i = \frac{\dot{P}_i \sum_j P_j \; - \; P_i \sum_j \dot{P}_j}{\left( \sum_j P_j \right)^2} $$
where all sums are from 1 to $n$. Using the Lotka–Volterra equations this becomes
$$ \dot{p}_i = \frac{f_i P_i \sum_j P_j \; - \; P_i \sum_j f_j P_j}{\left( \sum_j P_j \right)^2} $$
where we write $f_i$ to mean $f_i(P_1, \dots, P_n)$, and similarly for $f_j$. Using the definition of $p_i$ again, this simplifies to:
$$ \dot{p}_i = f_i p_i - \Big( \sum_j f_j p_j \Big) p_i $$
and thanks to the definition of mean fitness in Equation (3), this reduces to the well-known replicator equation:
$$ \dot{p}_i = \left( f_i - \overline{f} \right) p_i \qquad (6) $$
Now, the replicator equation implies
$$ \sum_i \dot{p}_i f_i = \sum_i \left( f_i - \overline{f} \right) p_i \, f_i \qquad (7) $$
On the other hand,
$$ 0 = \sum_i \left( f_i - \overline{f} \right) p_i \, \overline{f} \qquad (8) $$
since $\sum_i p_i f_i = \overline{f}$ but also $\sum_i p_i \overline{f} = \overline{f}$. Subtracting Equation (8) from Equation (7) we obtain
$$ \sum_i \dot{p}_i f_i = \sum_i \left( f_i - \overline{f} \right)^2 p_i $$
or simply
$$ \sum_i \dot{p}_i f_i = \mathrm{Var}(f). \qquad \square $$
The second term of Equation (5) only vanishes in special cases, e.g., when the fitness functions $f_i$ are constant. When the second term vanishes we have
$$ \frac{d}{dt} \overline{f} = \mathrm{Var}(f) \geq 0. $$
This is a satisfying result. It says the mean fitness does not decrease, and it increases whenever some replicators are more fit than others, at a rate equal to the variance in fitness. However, we would like a more general result, and we can state one using a concept from information theory: the Fisher speed.
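To make Theorem 1 concrete, one can integrate the Lotka–Volterra equations numerically and compare the first term of Equation (5) with the variance in fitness. The following sketch uses a small Euler step and an arbitrary density-dependent fitness function; both are illustrative assumptions, not part of the argument above.

```python
import numpy as np

def fitness(P):
    # Arbitrary smooth fitness functions f_i(P_1, ..., P_n) for the example.
    return np.array([1.0 - 0.01 * P.sum(),
                     0.5 + 0.002 * P[0],
                     0.8 - 0.005 * P[2]])

P = np.array([10.0, 20.0, 5.0])
dt = 1e-6

# One small Euler step of the Lotka-Volterra equations dP_i/dt = f_i(P) P_i.
f0 = fitness(P)
P_new = P + dt * f0 * P

# Finite-difference estimate of dp_i/dt.
p0, p1 = P / P.sum(), P_new / P_new.sum()
p_dot = (p1 - p0) / dt

mean_f = np.sum(p0 * f0)
variance = np.sum(p0 * (f0 - mean_f)**2)

# Theorem 1: sum_i p_dot_i f_i should equal Var(f), up to discretization error.
print(np.sum(p_dot * f0), variance)
```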
3. The Fisher Speed
While Theorem 1 allows us to express the variance in fitness in terms of the time derivatives of the probabilities $p_i$, it does so in a way that also explicitly involves the fitness functions $f_i$. We now prove a simpler formula for the variance in fitness, which equates it with the square of the ‘Fisher speed’ of the probability distribution $p(t) = (p_1(t), \dots, p_n(t))$.
The space of probability distributions on the set $\{1, \dots, n\}$ is the $(n-1)$-simplex
$$ \Delta^{n-1} = \Big\{ x \in \mathbb{R}^n \; : \; x_i \geq 0, \; \sum_{i=1}^n x_i = 1 \Big\}. $$
The Fisher metric is the Riemannian metric $g$ on the interior of the $(n-1)$-simplex such that given a point $x$ in the interior of $\Delta^{n-1}$ and two tangent vectors $v, w \in \mathbb{R}^n$ we have
$$ g(v, w) = \sum_{i=1}^n \frac{v_i w_i}{x_i}. $$
Here we are describing the tangent vectors $v, w$ as vectors in $\mathbb{R}^n$ with the property that the sum of their components is zero: this makes them tangent to the $(n-1)$-simplex. We are demanding that $x$ be in the interior of the simplex to avoid dividing by zero, since on the boundary of the simplex we have $x_i = 0$ for at least one choice of $i$.
If we have a time-dependent probability distribution $p(t)$ moving in the interior of the $(n-1)$-simplex as a function of time, its Fisher speed is defined by
$$ \| \dot{p}(t) \| = \Big( \sum_{i=1}^n \frac{\dot{p}_i(t)^2}{p_i(t)} \Big)^{1/2} $$
if the derivative $\dot{p}(t)$ exists. This is the usual formula for the speed of a curve moving in a Riemannian manifold, specialized to the case at hand.
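In code, the Fisher speed is straightforward to evaluate once $p(t)$ and $\dot{p}(t)$ are known; the small helper below is only an illustration of the formula above, with an arbitrary interior point and tangent vector.

```python
import numpy as np

def fisher_speed(p, p_dot):
    """Fisher speed of a curve in the interior of the simplex, given the
    point p and the velocity p_dot (whose components sum to zero)."""
    return np.sqrt(np.sum(p_dot**2 / p))

p = np.array([0.5, 0.3, 0.2])              # a point in the interior of the 2-simplex
p_dot = np.array([0.01, -0.005, -0.005])   # components sum to zero
print(fisher_speed(p, p_dot))
```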
These are all the formulas needed to prove our result. However, for readers unfamiliar with the Fisher metric, a few words may provide some intuition. The factor of $1/x_i$ in the Fisher metric changes the geometry of the simplex so that it becomes round, with the geometry of a portion of a sphere in $\mathbb{R}^n$. But more relevant here is the Fisher metric’s connection to relative information—a generalization of Shannon information that depends on two probability distributions rather than just one [12]. Given probability distributions $p, q \in \Delta^{n-1}$, the information of $q$ relative to $p$ is
$$ I(q, p) = \sum_{i=1}^n q_i \ln\!\Big( \frac{q_i}{p_i} \Big). $$
This is the amount of information that has been updated if one replaces the prior distribution $p$ with the posterior $q$. So, sometimes relative information is called the ‘information gain’. It is also called ‘relative entropy’ or ‘Kullback–Leibler divergence’. It has many applications to biology [9,10,13,14].
Suppose $p(t)$ is a smooth curve in the interior of the $(n-1)$-simplex. We can ask the rate at which information is being updated as time passes. Perhaps surprisingly, an easy calculation gives
$$ \frac{d}{dt} I(p(t), p(t_0)) \Big|_{t = t_0} = 0. $$
Thus, to first order, information is not being updated at all at any time $t_0$. However, another well-known calculation (see, e.g., [15]) shows that
$$ \frac{d^2}{dt^2} I(p(t), p(t_0)) \Big|_{t = t_0} = \| \dot{p}(t_0) \|^2. $$
So, to second order in $t - t_0$, the square of the Fisher speed determines how much information is updated when we pass from $p(t_0)$ to $p(t)$.
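This second-order relation is easy to check numerically: for a small step $dt$, the information of $p(t_0 + dt)$ relative to $p(t_0)$ should be close to $\tfrac{1}{2}\, dt^2 \, \| \dot{p}(t_0) \|^2$. The curve below is an arbitrary smooth path in the interior of the simplex, chosen only for illustration.

```python
import numpy as np

def curve(t):
    # An arbitrary smooth curve in the interior of the 2-simplex.
    w = np.array([1.0 + 0.3 * np.sin(t), 1.0 + 0.2 * t, 1.0])
    return w / w.sum()

def relative_information(q, p):
    # I(q, p) = sum_i q_i ln(q_i / p_i)
    return np.sum(q * np.log(q / p))

t0, dt = 0.0, 1e-3
p0 = curve(t0)
p_dot = (curve(t0 + dt) - curve(t0 - dt)) / (2 * dt)   # central difference

lhs = relative_information(curve(t0 + dt), p0)
rhs = 0.5 * dt**2 * np.sum(p_dot**2 / p0)              # (1/2) dt^2 ||p_dot||^2
print(lhs, rhs)   # these agree to leading order in dt
```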
Theorem 2. Suppose positive real-valued functions $P_i(t)$ obey the Lotka–Volterra equations for some continuous functions $f_i \colon (0,\infty)^n \to \mathbb{R}$. Then the square of the Fisher speed of the probability distribution $p(t) = (p_1(t), \dots, p_n(t))$ is the variance of the fitness:
$$ \| \dot{p}(t) \|^2 = \mathrm{Var}(f). $$
Proof. Consider the square of the Fisher speed
$$ \| \dot{p}(t) \|^2 = \sum_{i=1}^n \frac{\dot{p}_i(t)^2}{p_i(t)} $$
and use the replicator equation
$$ \dot{p}_i = \left( f_i - \overline{f} \right) p_i, $$
obtaining
$$ \| \dot{p}(t) \|^2 = \sum_{i=1}^n \left( f_i - \overline{f} \right)^2 p_i = \mathrm{Var}(f) $$
as desired. □
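As with Theorem 1, Theorem 2 can be tested numerically: integrate the Lotka–Volterra equations, estimate $\dot{p}$ by a finite difference, and compare $\| \dot{p} \|^2$ with $\mathrm{Var}(f)$. The fitness functions and step size here are again arbitrary choices made only for this sketch.

```python
import numpy as np

def fitness(P):
    # Arbitrary fitness functions, as in the earlier sketch.
    return np.array([1.0 - 0.01 * P.sum(),
                     0.5 + 0.002 * P[0],
                     0.8 - 0.005 * P[2]])

P = np.array([10.0, 20.0, 5.0])
dt = 1e-6

f0 = fitness(P)
P_new = P + dt * f0 * P                     # one Euler step of dP_i/dt = f_i P_i

p0, p1 = P / P.sum(), P_new / P_new.sum()
p_dot = (p1 - p0) / dt

mean_f = np.sum(p0 * f0)
variance = np.sum(p0 * (f0 - mean_f)**2)
fisher_speed_sq = np.sum(p_dot**2 / p0)

# Theorem 2: the square of the Fisher speed equals the variance in fitness.
print(fisher_speed_sq, variance)
```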
The generality of this result is remarkable. Formally, any autonomous system of first-order differential equations
$$ \frac{dP_i}{dt} = F_i(P_1, \dots, P_n) $$
can be rewritten as Lotka–Volterra equations
$$ \frac{dP_i}{dt} = f_i(P_1, \dots, P_n) \, P_i $$
simply by setting
$$ f_i(P_1, \dots, P_n) = \frac{F_i(P_1, \dots, P_n)}{P_i}. $$
In general $f_i$ is undefined when $P_i = 0$, but this is not a problem if we restrict ourselves to situations where all the populations $P_i$ are positive; in these situations, Theorems 1 and 2 apply.
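In code, this rewriting is just a componentwise division of the right-hand side by the populations. The vector field F below is an arbitrary example, and the division is only safe while every $P_i$ stays positive, as noted above.

```python
import numpy as np

def F(P):
    # An arbitrary autonomous vector field dP/dt = F(P), not of Lotka-Volterra form.
    return np.array([2.0 - P[0] * P[1], P[0] - 0.5 * P[1]])

def f(P):
    # Fitness functions f_i = F_i / P_i, defined only where every P_i is positive.
    return F(P) / P

P = np.array([1.5, 2.0])
print(F(P))        # dP/dt from the original equations
print(f(P) * P)    # the same dP/dt, now written in Lotka-Volterra form f_i(P) * P_i
```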