Tutorial: A New Open Source Framework for Genetic Algorithms.

TL;DR

Here is the Google Colab with all the code: ImageFinderGA.ipynb - Colaboratory (google.com)

Here is Finch's GitHub: Finch: A Keras style GA genetic algorithm library (github.com)

What are Genetic Algorithms?

Genetic algorithms are a type of evolutionary algorithm that evolves "solutions" to specific problems. They implement survival of the fittest, mutation, and parenting (crossover), and they can solve many problems much, much faster than brute-force search. Given this, I was surprised to learn that very few robust genetic algorithm libraries exist. One that stands out is PyGad. PyGad is excellent for basic genetic algorithms but is not as flexible or extendable as I wanted. Hence, I set out to make my own. It's called FinchGA, for obvious reasons, and is built to be stylistically very similar to Keras (a machine learning library). Here is a simple example of how to define a genetic algorithm "environment" in Finch:

env = SequentialEnvironment(layers=[
    Layers.GenerateData(pool, population=4, array_length=image_size, delay=0),
    Layers.Parents(pool, gene_size=200, family_size=2, delay=50, every=100),
    Layers.OverPoweredMutation(pool, iterations=2000, fitness_function=f),
    Layers.SortFitness(),
    Layers.KeepLength(4),
])
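For readers new to genetic algorithms, the core loop behind all of this can be sketched in a few lines of plain Python, independent of Finch. This toy example (all names here are illustrative, not part of Finch's API) evolves a list of numbers toward a target sum using the three ingredients mentioned above: survival of the fittest, parenting, and mutation.

```python
import random

def fitness(individual, target=100):
    # Higher is better: negative distance from the target sum
    return -abs(sum(individual) - target)

def evolve(pop_size=20, genes=5, epochs=200, seed=0):
    random.seed(seed)
    population = [[random.uniform(0, 50) for _ in range(genes)]
                  for _ in range(pop_size)]
    for _ in range(epochs):
        # Survival of the fittest: keep the best half
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        # Parenting: children mix genes from two random survivors
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]
            # Mutation: nudge one random gene slightly
            i = random.randrange(genes)
            child[i] += random.uniform(-1, 1)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

No brute force involved: rather than enumerating candidate lists, the population drifts toward the target because fitter individuals keep reproducing.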

This mirrors the Keras "Sequential" model. Because of this, you can create your own layers and use any combination of built-in layers (there are many more than in the sample code), each with many customizable behaviours. These layers allow Finch to far outpace older libraries. For example, here is an image recreated in Finch after 2,000 epochs. It is 98% similar to the original image. Note: it reaches 90% similarity in about half that many epochs.

[Image: the image recreated by Finch after 2,000 epochs]

Here is the same image after 20,000 generations in PyGad:

[Image: the same image after 20,000 generations in PyGad]
Source: GitHub - PyGad

Here is the original image:

[Image: the original image]


It should be noted that PyGad generations are generally faster than Finch epochs (though the two perform their tasks very differently). Also, PyGad is genuinely cool; please check it out, as it works amazingly well for many other problems.

Most genetic algorithms that people write fail at one thing: cheating. In pursuit of a genetic algorithm that mimics biological evolution, they forget that we can cheat. After all, the goal is not to simulate evolution; the goal is to solve a problem utilizing evolution. It's on a computer, so genetic algorithms can do things nature won't, well, naturally do. We can take shortcuts in evolution. This is the ethos behind Finch.

One way Finch does this is the "OverPoweredMutation" layer. This layer mutates an individual gene and only keeps the mutation if it benefits the fitness. It then incentivises mutating the genes that have been mutated the fewest times (as they are the worst). If this layer is too slow, there is a FastMutation layer that, as the name implies, is much faster. FastMutation still outperforms most mutation functions because it determines which genes are found in the fittest individuals. This allows the environment to learn things like which pixel colours are most likely to appear in the picture and bias mutation toward those genes.
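The "keep a mutation only if it helps" idea behind OverPoweredMutation amounts to a simple hill-climbing loop. Here is a sketch of the concept in plain Python (an illustration only, not Finch's actual implementation; the function name and parameters are invented for this example):

```python
import random

def keep_if_better_mutation(genes, fitness, iterations=1000, step=0.1, seed=0):
    """Mutate one random gene at a time, keeping the change
    only when it improves fitness."""
    random.seed(seed)
    genes = list(genes)
    best = fitness(genes)
    for _ in range(iterations):
        i = random.randrange(len(genes))
        old = genes[i]
        genes[i] += random.uniform(-step, step)
        new = fitness(genes)
        if new > best:
            best = new       # improvement: keep the mutation
        else:
            genes[i] = old   # no improvement: revert it
    return genes

# Example: drive a vector of 8 random values toward all 0.5s
random.seed(1)
target = [0.5] * 8
f = lambda g: -sum((a - b) ** 2 for a, b in zip(g, target))
result = keep_if_better_mutation([random.random() for _ in range(8)], f)
```

Because a bad mutation is always reverted, fitness can never get worse, which is exactly the kind of shortcut biological evolution doesn't get to take.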

Tutorial

Installation:



git clone -b main https://github.com/dadukhankevin/Finch

pip install imageio        

Import libraries



from Finch.FinchGA.GenePools import GenePool, TypedGenePool, FloatPool
from Finch.FinchGA import Layers, EvolveRates
from Finch.FinchGA.Environments import *
from Finch.FinchGA.FitnessFunctions import ImageSimilarity
from Finch.FinchGA.generic import *
import numpy as np
import imageio
import matplotlib.pyplot as plt

Read our image



target_im = imageio.imread("Finch/ExampleData/fruit.jpg")
target_im = np.asarray(target_im, dtype=float)

Create our fitness function. Fitness functions let us sort all the "individuals" in our population by how fit they are. The fitter an individual is, the more its genes get propagated throughout the population and the less likely it is to die (survival of the fittest). To do this we will use the built-in ImageSimilarity class, which contains our fitness function.



fitnessClass = ImageSimilarity("/content/Finch/ExampleData/fruit.jpg")
image_size = len(fitnessClass.target_genes)
print("Array length: ", image_size)
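Under the hood, an image-similarity fitness function can be as simple as measuring the distance between a candidate's pixel values and the target's. Here is a rough sketch of the idea (an assumption for illustration, not Finch's actual ImageSimilarity internals):

```python
import numpy as np

def image_similarity_fitness(genes, target_genes):
    """Return a similarity score in [0, 1]: 1.0 means a perfect
    pixel-for-pixel match with the target image."""
    genes = np.asarray(genes, dtype=np.float64)
    # Mean absolute pixel difference, normalized because genes live in [0, 1]
    error = np.abs(genes - target_genes).mean()
    return 1.0 - error

target = np.array([0.2, 0.8, 0.5])
print(image_similarity_fitness([0.2, 0.8, 0.5], target))  # 1.0
print(image_similarity_fitness([0.0, 0.0, 0.0], target))  # 0.5
```

Any function that maps a flat gene array to a single "higher is better" number would work equally well here.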

Here lies the real *magic*. In Finch, almost every layer parameter that takes a number can also take a "Rate". A Rate lets you change a parameter as the environment runs. Here we will define our rate so that it starts at 1 and decreases to 0.05 over the course of 1500 epochs (or generations). This rate will control how much each pixel colour (r, g, b) is mutated, and it decreases over time (because our image becomes more fit).



rate = EvolveRates.Rate(1, 0.05, 1500, return_int=False)        
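Conceptually, a Rate is just a value that steps linearly from a start to an end over a fixed number of epochs. A minimal sketch of that idea (the real EvolveRates.Rate has more options; the class below is invented for illustration):

```python
class LinearRate:
    """Moves linearly from start to end over `epochs` steps."""
    def __init__(self, start, end, epochs):
        self.value = start
        self.end = end
        self.step = (end - start) / epochs

    def get(self):
        return self.value

    def next(self):
        # Called once per epoch; clamp so we never pass the end value
        self.value += self.step
        if self.step < 0:
            self.value = max(self.value, self.end)
        else:
            self.value = min(self.value, self.end)
        return self.value

rate = LinearRate(1, 0.05, 1500)
for _ in range(1500):
    rate.next()
print(round(rate.get(), 2))  # 0.05
```

Passing `rate.get` (the method, not a number) into a layer is what lets the layer re-read the current value every epoch.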

Now we need a gene pool. Finch has three different types, but for now let's just worry about the FloatPool. This defines the range our pixel colour values can be generated in, on a scale of 0 to 1. We also give it our fitness function, which is used behind the scenes in the gene pool for many different things. Gene pools also help modify which genes are produced by GRN layers (gene regulatory network layers), but we won't be using those in this tutorial.



pool = FloatPool(0, 1, fitnessClass.fitness_fun)         
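At its simplest, a float gene pool just knows how to produce random genes within a range (and keeps the fitness function on hand for smarter layers to use). A simplified sketch of that role (not the real FloatPool; the class name is invented here):

```python
import random

class SimpleFloatPool:
    """Generates gene arrays with values uniformly in [low, high]."""
    def __init__(self, low, high, fitness_function):
        self.low = low
        self.high = high
        self.fitness_function = fitness_function

    def generate(self, length):
        return [random.uniform(self.low, self.high) for _ in range(length)]

pool = SimpleFloatPool(0, 1, lambda g: -sum(g))
genes = pool.generate(10)
```

For our image problem, each generated gene is one normalized colour channel value of one pixel.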

Next, let's define our environment. In Finch, an environment defines all the "layers" that will change, mutate, and create children of our individuals (among many other things).





env = SequentialEnvironment(layers=[
    Layers.GenerateData(pool, population=4, array_length=image_size, delay=0),
    Layers.Parents(pool, gene_size=200, family_size=2, delay=1, every=1, method="best", amount=2),
    Layers.OverPoweredMutation(pool, iterations=2000, index=-1, fitness_function=fitnessClass.fitness_fun, range_rate=rate.get, method="smart"),
    Layers.SortFitness(),
    Layers.KeepLength(4),
])

Here we will:

  1. Generate 4 random images
  2. Parent the top 2 most fit images. This will create 2 children that are essentially a mix of the parents.
  3. Mutate the data, and only keep mutations that help the fitness.
  4. Sort the individuals by fitness (how similar they are to our picture).
  5. Keep the population capped at 4 (this keeps our environment fast).
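Put together, one epoch of the environment above amounts to something like the following loop (a pseudocode-style sketch of the five steps, not Finch's internals; the helper names are invented):

```python
import random

def run_epoch(population, fitness, mutate, crossover, max_pop=4):
    # 2. Parent the two fittest individuals
    population.sort(key=fitness, reverse=True)
    a, b = population[0], population[1]
    population += [crossover(a, b), crossover(b, a)]
    # 3. Mutate, keeping only mutations that help the fitness
    for i, ind in enumerate(population):
        candidate = mutate(ind)
        if fitness(candidate) > fitness(ind):
            population[i] = candidate
    # 4-5. Sort by fitness and trim back down to max_pop
    population.sort(key=fitness, reverse=True)
    return population[:max_pop]

# Toy usage: evolve 4-gene lists toward a target sum of 10
random.seed(1)
fitness = lambda g: -abs(sum(g) - 10)
mutate = lambda g: [x + random.uniform(-0.5, 0.5) for x in g]
crossover = lambda a, b: [random.choice(p) for p in zip(a, b)]
pop = [[random.uniform(0, 5) for _ in range(4)] for _ in range(4)]
for _ in range(100):
    pop = run_epoch(pop, fitness, mutate, crossover)
```

Finch's layers do considerably more bookkeeping, but the shape of each epoch is the same: parent, mutate, sort, trim.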

Now we can create a function that will show the best image after every n epochs (iterations, generations or loops).



n = 0
def callback():
    global n
    if n % 50 == 0:
        try:
            print(env.best_ind)
            solution = env.best_ind.genes
            print(solution.shape)
            result = genes2img(solution, target_im.shape)
            plt.imshow(result)
            plt.title("Finch")
            plt.show()
        except Exception:
            pass
    n += 10

Now let us compile and run our model! In the callbacks= parameter we will put rate.next; this will decrease our rate each epoch. We will also add our callback function so we can watch the images as they progress.



env.compile(epochs=1000, fitness=fitnessClass, every=10, callbacks=[rate.next, callback])

data, hist = env.simulate_env()

Et voilà! Now let's let it evolve!

There are lots of other things we can do, like defining multiple environments and passing our old data into env2.compile(...data=data...). Here is a Google Colab with all the code!

ImageFinderGA.ipynb - Colaboratory (google.com)

Conclusion:

Genetic algorithms are highly underdeveloped and underutilized. If an ounce of the genius that goes into creating neural networks went into genetic algorithms, the field's utility would grow exponentially. They provide a way to find "truth" that neural networks simply do not, and they require no training data to do so. This also limits their harmful bias in comparison to typical AI models. If the two fields worked together and combined, I believe we could see some really cool things.

Further testing:

  1. Try changing different parameters and see what happens.
  2. Change the fitness function to whatever you want (you can look at the other examples on GitHub to learn how to do this).
  3. Try it with much larger images (it takes a bit longer).
  4. Feel free to contribute on GitHub!

Sources:

Finch (this library):

Finch: A Keras style GA genetic algorithm library (github.com)

PyGad:


@misc{gad2021pygad,
      title={PyGAD: An Intuitive Genetic Algorithm Python Library}, 
      author={Ahmed Fawzy Gad},
      year={2021},
      eprint={2106.06158},
      archivePrefix={arXiv},
      primaryClass={cs.NE}
}        
