This week I've been having a play around with genetic algorithms and had a tonne of fun.
What are genetic algorithms?
Genetic algorithms are inspired by Charles Darwin's theory of evolution. The premise is that the most successful individuals reproduce and pass their genetic traits on to their offspring, i.e. natural selection.
There are 5 steps to the algorithm:
- Creating a new generation
- Evaluating individuals
- Selection
- Reproduction
- Mutation
And these steps are repeated continuously until you find your goal. These steps will be explained in more detail in the example below.
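Roughly speaking, one pass through that loop looks something like this - the function names here are just placeholders for the steps above, not the exact code from the examples that follow:

// A rough sketch of a single generation - placeholder function names,
// not the exact code from the snippets below.
function runGeneration() {
  calculateFitness(population); // evaluate every individual
  naturalSelection();           // build a mating pool weighted by fitness
  reproduce();                  // crossover + mutation creates the next generation
}
// ...and this gets called over and over until an individual reaches the goal.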
My first genetic algorithm
There's no better way to understand an algorithm than to write it yourself, so I did exactly that. My first genetic algorithm was made using JavaScript with the p5.js library.
The goal for the algorithm is very simple: go from one point to another, and the quicker you get there the better. So I'm going to spawn a circle that has a random set of forces applied to it sequentially - these are its genes.
createGenes() {
  let s = [];
  for (let j = 0; j < GENE_LENGTH; j++) {
    s[j] = p5.Vector.random2D();
  }
  return s;
}
Now let's create a number of these individuals - this is our population.
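Something along these lines should do the trick (a minimal sketch - I'm assuming a DNA class, like the one used in the later snippets, whose constructor calls createGenes() to give each individual its genes):

// A minimal sketch - assumes a DNA class whose constructor calls createGenes().
const POPULATION_SIZE = 100; // assumed size, pick whatever works for you
let population = [];

function createPopulation() {
  for (let i = 0; i < POPULATION_SIZE; i++) {
    population[i] = new DNA();
  }
}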
Now, after a certain amount of time, or if one of the individuals reaches the target, I'll need to select the most successful individuals in the population to reproduce. To do this we need to calculate each individual's fitness. Fitness is a value that represents how successful an individual was; usually the value is normalised, so an individual with a fitness of 0.9 is better than an individual with a fitness of 0.3.
So to calculate the fitness in my example I need to determine how far away from the goal each individual is - the closer to the goal, the higher the fitness:
calcFitness(pos) {
  const distanceToGoal = dist(pos.x, pos.y, goal.x, goal.y);
  let normalised = distanceToGoal / height;
  this.fitness = 1 - normalised;
}
The next step is selection. Selection is just a means of giving the more successful individuals a higher chance of reproducing. To do this I'm going to stick each individual into an array a number of times proportional to its fitness score. So an individual with a fitness of 0.9 will get put into the array 90 times, and an individual with a fitness of 0.5 will get put in 50 times. The next generation will be randomly selected from this array, so each subsequent generation should - in theory - get stronger.
function naturalSelection() {
  matingPool = [];
  for (let pop of population) {
    // fitness is the value set in calcFitness above
    let n = floor(pop.fitness * 100);
    for (let i = 0; i < n; i++) {
      matingPool.push(pop);
    }
  }
}
Next comes reproduction: using the array we've just created, we pick two random individuals and cross their genetic material to create a child.
function reproduce() {
  for (let i = 0; i < population.length; i++) {
    let mummyIndex = floor(random(matingPool.length));
    let daddyIndex = floor(random(matingPool.length));
    let mummy = matingPool[mummyIndex];
    let daddy = matingPool[daddyIndex];
    let child = mummy.crossover(daddy);
    child.mutate(mutationRate);
    population[i] = child;
  }
}
The term for this mixing of genes is crossover, and it can be done in a number of ways. The approach I'm taking simply combines the genes so that each element alternates between a mummy gene and a daddy gene:
[M,D,M,D,M,D,M,D,M,D,M,D]
crossover(partner) {
  let child = new DNA();
  child.genes = [];
  for (let i = 0; i < this.genes.length; i++) {
    if (i % 2 == 0) {
      child.genes.push(this.genes[i]);
    } else {
      child.genes.push(partner.genes[i]);
    }
  }
  return child;
}
Another approach to crossover is to literally split the genes in half, so the first half of the child's genes come from daddy and the second half come from mummy.
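As a rough sketch (not the crossover I'm actually using), that split-in-half approach might look like:

// Not the crossover I'm using - just a sketch of the "split in half" approach.
crossover(partner) {
  let child = new DNA();
  child.genes = [];
  let midpoint = floor(this.genes.length / 2);
  for (let i = 0; i < this.genes.length; i++) {
    if (i < midpoint) {
      child.genes.push(partner.genes[i]); // first half from daddy
    } else {
      child.genes.push(this.genes[i]); // second half from mummy
    }
  }
  return child;
}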
You'll notice that the reproduce function above has a line which calls a mutate function - this is the final step. Mutation allows us to maintain genetic diversity, much like biological mutation. The function takes a mutationRate, which is some small percentage (2% in my example).
The code here is quite simple: I loop through every gene, and if a random value is less than the mutationRate I replace that gene with a new random vector.
mutate(mutationRate) {
  for (let i = 0; i < this.genes.length; i++) {
    if (random(1) < mutationRate) {
      this.genes[i] = p5.Vector.random2D();
    }
  }
}
Let's see it in action:
Generation 0
All movements are completely random and no individuals reach the goal.
Generation 20
All individuals seem to be heading in the correct direction - up. There's still quite a lot of random side-to-side movement though. But at least they reach the destination!
Generation 50
It's now reaching the goal with a bit of speed! There's still so much room to improve though - I'll let it run for a while!
Generation 550
Now we're talking ⏩⏩⏩
Genetic algorithm racer
Getting a circle to fly up is really quite a simple problem, I wanted to up the ante a little and create a racing game where the car would have to find its way around the track. Note, all the code for the following Genetic Algorithm can be found here.
Okay, I should probably add some hit detection and some way to calculate their fitness. I'm going to start off really simple and assume that the longer the car is alive, the more successful it is.
calcFitness(timeAlive) {
  this.fitness = map(timeAlive, 0, 10000, 0, 1);
}
Everything else is pretty much the same as the previous genetic algorithm, except that each individual's genetic code now represents an angle to turn the car - the cars move at a constant speed, and the only thing that changes is the steering.
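So the gene creation for a car ends up looking something like this (a sketch that mirrors the earlier createGenes() - TURN_MAX here is the same maximum steering angle used in the mutate function further down):

// A sketch of the car's gene creation - each gene is now a steering angle
// between -TURN_MAX and TURN_MAX instead of a force vector.
createGenes() {
  let s = [];
  for (let j = 0; j < GENE_LENGTH; j++) {
    s[j] = random(-TURN_MAX, TURN_MAX);
  }
  return s;
}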
Let's see them drive!
Generation 0
Generation 10
We've cleared the first corner! (The green car represents the most successful car from the previous generation - I add it to the gene pool each iteration so things don't get any worse.)
Generation 50
We've nearly got around the second corner. Not many cars seem to be getting near the end though - maybe the mutation rate is a little high at 3%.
Insufficient fitness function
The fitness function I'm using is pretty awful - take a look at the example below. The car that gets the furthest isn't considered the best, because it didn't stay alive for the longest time. So my fitness function is essentially promoting poor driving!
A better fitness function
Given enough time, my old fitness function might well complete the track after a few million generations, but I lack the patience to find out.
Checkpoints
I'm going to update the track so that it has checkpoints: if a car passes a checkpoint it adds to that car's score, and the score becomes its fitness. This should hopefully prevent the previous issue.
The circles represent the checkpoints (they won't be visible).
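A rough sketch of how the checkpoint scoring might work - the names here (nextCheckpoint, score, CHECKPOINT_RADIUS) are illustrative rather than lifted from the repo:

// A sketch of checkpoint-based fitness - names are illustrative, not from the repo.
// Each car tracks the next checkpoint it needs to reach and scores a point when it gets there.
updateScore(carPos) {
  let next = checkpoints[this.nextCheckpoint];
  if (next && dist(carPos.x, carPos.y, next.x, next.y) < CHECKPOINT_RADIUS) {
    this.score++;
    this.nextCheckpoint++;
  }
}

calcFitness() {
  // more checkpoints passed = higher fitness
  this.fitness = this.score / checkpoints.length;
}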
I've also added some code to alter how the mutation works, so that it only mutates genes near the point where the car's ancestors crashed. So basically I won't be losing any cars early on in the track.
mutate(mutationRate) {
  // only mutate genes from just before the point where the car's ancestors crashed
  let start = max(0, ancestorDiePoint - buffer);
  for (let i = start; i < this.genes.length; i++) {
    if (random(1) < mutationRate) {
      this.genes[i] = random(-TURN_MAX, TURN_MAX);
    }
  }
}
Generation 50:
By Generation 50 it looks as though the end is in sight!
Generation 250 🥳:
Finally
Yey, the car made it! I created a genetic algorithm that can drive itself around a track - that's great. But what if I want to export the genes of the car that reached the end and put it onto another track? It'd be useless. Why? Because the steering/DNA is specific to this particular track. If we want to create a general-purpose self-driving racing car that can be plonked onto any track, then we're going to need something a little more than a genetic algorithm: we're going to need a Neural Network, which I'll be covering in my next blog, next week!
This blog was largely inspired by Daniel Shiffman's YouTube course on genetic algorithms so big thanks to him!
I hope you've enjoyed this blog! If you do by some miracle enjoy my blabbering, then head over to my blogging site at codeheir.com, where I write weekly blogs about whatever in the world of programming has my attention!