This program illustrates a fallacy: the claim that a set of mutations (or a string of amino acids) could come about only with a probability so infinitesimally small that it can safely be discounted as ever being possible.

The fallacy rests on a simple misapplication of probability theory: the rule that the probability of a set of events
is the product of the probabilities of each individual event.

For example, if we consider five mutations, each with a 1/100 probability of occurring,
then the probability of all of them happening, according to the fallacy, is 1/100 × 1/100 × 1/100 × 1/100 × 1/100
= 1/10,000,000,000 (one in ten billion).

In other words, we'd need 10 billion trials to get 5 mutations.
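The naive product can be checked directly. A quick sketch in Python, using the 1/100 figure from the example above:

```python
p_single = 1 / 100       # probability of one mutation occurring
p_all = p_single ** 5    # naive product for all five mutations at once

# Expected number of trials under the fallacious calculation
print(f"1 in {1 / p_all:,.0f}")  # → 1 in 10,000,000,000
```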

Why is this a fallacy? Because the calculation is valid only when the events are completely independent of one another.

In other situations, the events are not independent. Here, we have a population of 100 "creatures",
each with 5 elements that may mutate with a 1/100 chance on reproduction. Each mutation is advantageous:
a creature with mutations is more likely to reproduce.

Each line in the output represents a generation. Each number represents a creature.
The value of the number is the number of mutations it possesses.

See how many trials it takes to get our "one in 10 billion" chance of 5 mutations!
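The setup described above can be sketched in Python. The selection scheme here is an assumption (fitness-proportional, with each creature weighted by one plus its mutation count), as is the stopping rule (run until some creature carries all 5 mutations); the original program may differ in these details.

```python
import random

POP_SIZE = 100        # creatures per generation
GENES = 5             # elements that may mutate
MUTATION_RATE = 0.01  # 1/100 chance per element on reproduction

def reproduce(parent):
    # A child inherits the parent's mutations; each unmutated
    # element has a 1/100 chance of mutating.
    return [g or (random.random() < MUTATION_RATE) for g in parent]

def step(population):
    # Assumed selection scheme: a creature's chance of being chosen
    # as a parent grows with its mutation count, since each
    # mutation is advantageous.
    weights = [1 + sum(creature) for creature in population]
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    return [reproduce(p) for p in parents]

def run():
    population = [[False] * GENES for _ in range(POP_SIZE)]
    generation = 0
    # Stop once any creature carries all 5 mutations.
    while not any(all(creature) for creature in population):
        # One line per generation; each number is one creature's
        # mutation count, matching the output format described above.
        print(" ".join(str(sum(c)) for c in population))
        population = step(population)
        generation += 1
    return generation

generations = run()
print(f"All {GENES} mutations reached after {generations} generations "
      f"({generations * POP_SIZE} reproductions), not 10 billion trials.")
```

Because mutations are preserved and amplified by selection rather than demanded all at once, the loop typically finishes in a modest number of generations.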

Click refresh/reload to re-run the simulation as many times as you wish. The source code for the simulation can be found here.