OK, you’ve been very patient.  All this talk of genetics, drift and populations, and so far we haven’t seen any ‘survival of the fittest’.  Well, here goes.

The last version of the overlapping population model simply killed individuals at random with a probability of 0.25. By making this probability dependent on the individual's trait, we link survival to genetics, which is the basis of Darwinian evolution.

So, I’ve slightly modified the mortality function in the ComplexPopulation<T> class to call a virtual method to calculate the probability that a specific individual dies:

        ///<summary> 
        /// Does one season of mortality. 
        ///</summary> 
        ///<remarks>Performs probability-based deletions of individuals in the population.</remarks> 
        public void DoOneSeasonOfMortality() 
        { 
            System.Random rand = new Random();   

            for (int i = 0; i < mPopulationSize; i++) 
            { 
                double chance = rand.NextDouble();   

                if (chance <= ProbabilityOfDeath(population[i])) 
                { 
                    // Push index into stack then delete individual. 
                    stackTop++; 
                    stack[stackTop] = i; 
                    population[i] = null; 
                } 
            } 
        }   

        ///<summary> 
        /// Calculates the probability that this individual dies 
        ///</summary> 
        ///<param name="individual">The individual.</param> 
        ///<returns>A value between 0 and 1</returns> 
        public virtual double ProbabilityOfDeath(IIndividual individual) 
        { 
            // 0.25 probability of death for each individual. 
            return 0.25; 
        }

 

This has no effect on the functionality of this class, but it enables us to create a new population class with a different mortality function:

///<summary> 
/// An overlapping population with trait-dependent mortality. 
///</summary> 
///<typeparam name="T">The <see cref="IIndividual"/>-based type in the population.</typeparam> 
class MortalityPopulation<T> : ComplexPopulation<T> where T : class, IIndividual, new() 
{ 
    public MortalityPopulation(int populationSize, int MaxTraitValue) 
        : base(populationSize, MaxTraitValue) 
    { 
    }   

    ///<summary> 
    /// Calculates mortality as a straight line relationship with trait value. 
    ///</summary> 
    ///<param name="individual">The <see cref="IIndividual"/>.</param> 
    ///<returns>A <see cref="double"/> between 0 and 1 representing the probability of mortality for this individual.</returns> 
    public override double ProbabilityOfDeath(IIndividual individual) 
    { 
        int traitValue = individual.TraitValue;   

        return ((double)traitValue / (double)base.mMaxTraitValue) / (double)10; 
    } 
}

 

All this derived class does is override the virtual method to provide a calculated probability of death.
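For reference, a minimal driver might look something like the sketch below. Note that 'Individual' is just a stand-in for whatever IIndividual implementation the earlier posts built, and the breeding step is only hinted at; the only call shown here that appears in this post's code is DoOneSeasonOfMortality.

// Hypothetical driver - 'Individual' is assumed to be the IIndividual
// implementation from the earlier posts; the breeding step is only sketched.
MortalityPopulation<Individual> pop = new MortalityPopulation<Individual>(1000, 20);

for (int generation = 0; generation < 1000; generation++)
{
    pop.DoOneSeasonOfMortality();
    // ... breeding and statistics gathering would go here ...
}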

The equation we’re using is charted as:

Chart of fitness functions.

We’ll come to the three ‘strength’ factors later – for the moment we’ll use the ‘default’ strength of 10. This means that individuals with a trait value of 20 (AA at every locus) have a 0.1 probability of death, while individuals with a trait value of 0 (aa at every locus) have a probability of death of 0 (i.e. they live forever!).
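As a quick sanity check, here is a tiny stand-alone sketch (not part of the simulation classes – the class name is just for illustration) that tabulates the equation for a few trait values at the default strength of 10:

// Stand-alone sanity check: P(death) = (trait / maxTrait) / strength.
class FitnessCheck
{
    static void Main()
    {
        const double maxTrait = 20.0;
        const double strength = 10.0;

        foreach (int trait in new int[] { 0, 10, 20 })
        {
            double pDeath = (trait / maxTrait) / strength;
            System.Console.WriteLine("Trait {0,2} -> P(death) = {1:0.00}", trait, pDeath);
        }
        // Prints 0.00 for trait 0, 0.05 for trait 10 and 0.10 for trait 20.
    }
}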

So what is the result of running a population through this simulation?

Average trait evolving over 1000 generations.

So there’s a marked drop in the average trait value and a gradual decrease in the standard deviation.  This is because the population’s distribution is being ‘squashed’ up against the edge of the range of possible values.

If we look at the trait as a 3D chart of distribution over time we get:

3D image of trait evolving over time.

That’s a bit difficult to see, so here are the distribution histograms over time:

[Histograms of the trait distribution over successive generations.]

So you can see, firstly, that at generation 0, before any evolution has occurred, the trait is normally distributed in the population, but as our selective pressure takes hold (by generation 50) the distribution stops being normal.

Imagine that the normal distribution is made of jelly.  Evolution is acting like a hand pushing the jelly to the left: while it is being pushed, the jelly distorts. This hand is the ‘selective pressure’ of evolution. The distribution moves because individuals die and are born, not because individuals change during their lifetimes.

In this particular example, there is a wall in the way at trait value 0; there is nowhere left to evolve to.  Pushing the jelly against this wall makes it kind of go ‘splat’ against that wall, but the hand doesn’t stop pushing and it ends up holding the jelly against the wall.

If the wall wasn’t there and the hand stopped pushing, the jelly would return to its normal distribution shape.

From this we can draw a conclusion: measuring the distribution of a physical trait in a population might give us a clue as to whether it is under strong selective pressure.  If the distribution is skewed away from normal, that’s a clue that strong selective pressure is forcing the population to evolve.
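If you wanted to check for that numerically rather than by eye, a rough sample skewness over the trait values would do it. This is just a sketch of my own, not part of the simulation code:

///<summary>
/// Rough sample skewness of a set of trait values.
/// Close to 0 for a normal distribution; well away from 0 once selection
/// has squashed the distribution against a boundary.
///</summary>
static double Skewness(int[] traits)
{
    double n = traits.Length;

    double mean = 0;
    foreach (int t in traits) mean += t;
    mean /= n;

    double variance = 0;
    foreach (int t in traits) variance += (t - mean) * (t - mean);
    double sd = System.Math.Sqrt(variance / n);

    double skew = 0;
    foreach (int t in traits) skew += System.Math.Pow((t - mean) / sd, 3);
    return skew / n;
}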

Returning to our graph of mortality probability by trait value, we can turn the ‘strength’ of the selective pressure up or down.

In our code:

return ((double)traitValue / (double)base.mMaxTraitValue) / (double)10;

 

the value ’10’ is the strength of the selective pressure.  If we make it smaller, the selective pressure is stronger; if we make it larger, the selective pressure is weaker.
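If you want to experiment with different strengths without editing the literal each time, one option (a sketch of my own, not part of the original code – the class name is hypothetical) is to pass the strength in through the constructor:

// Hypothetical variant: the selection strength is a constructor argument
// instead of a hard-coded constant. Uses the same ComplexPopulation<T>
// base class and IIndividual interface as above.
class TunableMortalityPopulation<T> : ComplexPopulation<T> where T : class, IIndividual, new()
{
    private readonly double mStrength;

    public TunableMortalityPopulation(int populationSize, int maxTraitValue, double strength)
        : base(populationSize, maxTraitValue)
    {
        mStrength = strength;   // 1 = strong selection, 25 = weak selection
    }

    public override double ProbabilityOfDeath(IIndividual individual)
    {
        return ((double)individual.TraitValue / (double)base.mMaxTraitValue) / mStrength;
    }
}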

Setting it to 1 gives the following change in mean trait:

[Chart of mean trait value under strong selective pressure (strength factor 1).]

As you can see, the change in mean is sharper than before.  This is because more individuals are dying each season, so changes occur faster.

Setting the ‘strength’ factor to 25 gives the following:

[Chart of mean trait value under weak selective pressure (strength factor 25).]

Here the changes take place more slowly.

So there you have a simple evolutionary model based on ‘survival of the fittest’ Darwinian Evolution.

Next I shall show how Sexual Selection can also cause evolutionary change in place of survival of the fittest.

The code for this simulation can be found over in the Channel 9 sandbox.

 

If you’re interested in writing software, check out my other blog: Coding at The Coal Face


OK, time for some code.

First we have to represent a single genetic locus.  We are going to use the basic dominant/recessive model, with two alleles (‘a’ and ‘A’) giving three possible genotypes at a specific locus: aa, aA & AA.  We’ll use an ‘enum’ for this one:

///<summary>
/// Enum defining possible genotypes at a single locus.
///</summary>
public enum locus_enum
{
    aa = 0,
    aA = 1,
    AA = 2
}


Notice the numeric values assigned to each one.  We can read the numeric values as ‘the number of dominant alleles’.  This can come in handy later.
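For example, casting the enum to an int gives the dominant-allele count directly:

int dominantAlleles = (int)locus_enum.aA;   // 1 - one dominant allele in 'aA'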

We’ll wrap this up with an object called ‘Locus’ to add some methods to this data.

public class Locus
{
    private locus_enum _locus;

    public Locus(locus_enum l)
    {
        _locus = l;
    }

    ///<summary>
    /// True when the recessive phenotype is expressed, i.e. both alleles are recessive (aa).
    ///</summary>
    public bool Phenotype
    {
        get
        {
            return _locus == locus_enum.aa;
        }
    }

    ///<summary>
    /// The number of dominant alleles at this locus (0, 1 or 2).
    ///</summary>
    public int A_Count
    {
        get
        {
            return (int)_locus;
        }
    }

    public override string ToString()
    {
        switch (_locus)
        {
            case locus_enum.aa:
                return "aa";
            case locus_enum.aA:
                return "aA";
            case locus_enum.AA:
                return "AA";
            default:
                return "ERROR!";
        }
    }

    private locus_enum locus
    {
        get
        {
            return _locus;
        }
    }
}

Next, we need to be able to manipulate genes. Specifically, we need to be able to cross them using Mendelian genetics.

Now, the simple brute-force method would be to randomly select one allele from each parent’s locus and then combine them to create the offspring’s locus.

We’re not going to do that; we’re going to use a probability matrix.

We know all the probabilities of producing a specific type of offspring given two parents, so we just need to generate some random numbers and compare them against those probabilities.

We shall take our example from ‘A Bit Of Genetics’:

If both parents are ‘aA’ then the offspring probabilities are:

  • aa — 25%
  • aA — 50%
  • AA — 25%

So we add these probabilities together to create a ‘line of probability’ that runs from 0.0 to 1.0, broken up into chunks whose sizes match the relative probabilities, like this:

Probability line diagram

So we can generate a random number between 0.0 and 1.0, find where it falls on the line, and the result is the value for that chunk of the line.  We can do all of this with a probability matrix that covers all the possible parental and offspring values.  The matrix has three dimensions, so it looks like a cube:

Offspring probability cube.

In C# this looks like:

// Prepare the offspring probability matrix.
// Indexed as [parent one A_Count, parent two A_Count, offspring genotype];
// each entry holds the cumulative probability of that offspring genotype.
p_offspring = new float[, ,] {
     {   {1.0F , 1.0F , 1.0F  },
         {0.5F , 1.0F , 1.0F  },
         {0.0F , 1.0F , 1.0F  }
     },
     {   {0.5F , 1.0F , 1.0F  },
         {0.25F, 0.75F, 1.0F  },
         {0.0F , 0.5F , 1.0F  }
     },
     {   {0.0F , 1.0F , 1.0F  },
         {0.0F , 0.5F , 1.0F  },
         {0.0F , 0.0F , 1.0F  }
     }
};
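To see how a single lookup works, here is a small sketch of the ‘line of probability’ test for two aA parents (both have an A_Count of 1, so the cumulative values are 0.25, 0.75 and 1.0). It is written as if we were inside the Locus class, where p_offspring lives:

// Example lookup for two aA parents (A_Count == 1 for both).
double roll = new System.Random().NextDouble();

locus_enum offspring;
if (roll <= p_offspring[1, 1, (int)locus_enum.aa])          // 0.25
    offspring = locus_enum.aa;
else if (roll <= p_offspring[1, 1, (int)locus_enum.aA])     // 0.75
    offspring = locus_enum.aA;
else                                                        // up to 1.0
    offspring = locus_enum.AA;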

We then need a method that will return an offspring when given two parents:

///<summary>
/// Crosses the two specified loci to produce an offspring locus.
///</summary>
///<param name="o">The first parent's locus.</param>
///<param name="t">The second parent's locus.</param>
///<returns>The offspring <see cref="Locus"/>.</returns>
public static Locus Cross(Locus o, Locus t)
{
    System.Random r = new Random();
    double off = r.NextDouble();

    Locus ret;

    /* check to see if the offspring is aa */
    if ((off <= p_offspring[o.A_Count, t.A_Count, (int)locus_enum.aa]) && (p_offspring[o.A_Count, t.A_Count, (int)locus_enum.aa] != 0))
    {
        ret = new Locus(locus_enum.aa);

        switch (o.locus)
        {
            case locus_enum.aa:
                if (t.A_Count == 2)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;
            case locus_enum.aA:
                if (t.A_Count == 2)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;
            case locus_enum.AA:
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;

        }

        return ret;
    }

    /* check to see if the offspring is aA */
    if ((off <= p_offspring[o.A_Count, t.A_Count, (int)locus_enum.aA]) && (p_offspring[o.A_Count, t.A_Count, (int)locus_enum.aA] != 0))
    {
        ret = new Locus(locus_enum.aA);
        switch (o.locus)
        {
            case locus_enum.aa:
                if (t.A_Count == 0)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;
            case locus_enum.aA:
                break;

            case locus_enum.AA:
                if (t.A_Count == 2)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;

        }
        return ret;
    }

    /* check to see of the offspring is AA */
    if (off <= p_offspring[o.A_Count, t.A_Count, (int)locus_enum.AA])
    {
        ret = new Locus(locus_enum.AA);
        switch (o.locus)
        {
            case locus_enum.aa:
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;
            case locus_enum.aA:
                if (t.A_Count == 0)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;
            case locus_enum.AA:
                if (t.A_Count == 0)
                {
                    string message = string.Format("Parent one : {0},\tParent two {1}\tOffspring : {2}\n", o.locus, t.locus, ret);
                    throw new Exception("Locus produced the WRONG offspring!\n\n" + message);
                }
                break;

        }
        return ret;
    }

    /* if we got here, there's been an error! */
    /* (no return is needed afterwards - the compiler knows the throw ends the method) */
    throw new Exception("Probability screw-up in Locus.Cross()");

}

To make them easily accessible, we’ll embed the probability matrix and the crossing method as static members of the ‘Locus’ class on which they operate.
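As a quick hypothetical usage example (a sketch, not taken from the real project), crossing an aA parent with an AA parent can only ever give aA or AA offspring:

// Hypothetical usage of the static Cross method.
Locus mum = new Locus(locus_enum.aA);
Locus dad = new Locus(locus_enum.AA);

Locus child = Locus.Cross(mum, dad);
System.Console.WriteLine(child);   // prints "aA" or "AA" - never "aa" for these parents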

I have put the code into a ZIP file at the Channel 9 Sandbox.  There is a single Visual Studio 2005 solution with an executable project that does nothing (yet) and a unit test assembly called SimulationTests. If you don’t have Visual Studio 2005, this project will also work in the Visual Studio C# Express version (which is currently free).  The unit tests are written using NUnit which you will have to install as well. So far you can just read the code and run the tests.

I shall be using this project as the basis for all the demonstrations I build up in the future.

 

If you’re interested in writing software, check out my other blog: Coding at The Coal Face