Source code of PyGAD, a Python 3 library for building the genetic algorithm and training machine learning algorithms (Keras & PyTorch).
- Before the `run()` method completes, update the 2 instance attributes: 1) `last_generation_parents` 2) `last_generation_parents_indices`. This keeps the list of parents up-to-date with the latest population fitness `last_generation_fitness`. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/275
- Some methods with names starting with `run_` were added. Their purpose is to keep the main loop inside the `run()` method clean. Check the [Other Methods](https://pygad.readthedocs.io/en/latest/pygad.html#other-methods) section for more information.

Release Date 29 January 2024
- If the `stop_criteria` parameter is used with the `reach` keyword, then multiple numeric values can be passed when solving a multi-objective problem. For example, if a problem has 3 objective functions, then `stop_criteria="reach_10_20_30"` means the GA stops if the fitness of the 3 objectives is at least 10, 20, and 30, respectively. The number of values must match the number of objective functions. If a single value is found (e.g. `stop_criteria="reach_5"`) when solving a multi-objective problem, then it is used across all the objectives. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/238
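The matching behavior described above can be sketched in plain Python. This is an illustrative sketch of the logic, not PyGAD's internal code; the function names are hypothetical.

```python
# Sketch (not PyGAD internals) of checking a multi-objective
# "reach" criterion such as stop_criteria="reach_10_20_30".

def parse_reach_criterion(criterion):
    """Split e.g. 'reach_10_20_30' into the list [10.0, 20.0, 30.0]."""
    parts = criterion.split("_")
    assert parts[0] == "reach"
    return [float(p) for p in parts[1:]]

def reach_satisfied(fitness, thresholds):
    """True if every objective's fitness meets its threshold.
    A single threshold is applied to all objectives."""
    if len(thresholds) == 1:
        thresholds = thresholds * len(fitness)
    return all(f >= t for f, t in zip(fitness, thresholds))
```

For example, `reach_satisfied([12, 25, 31], parse_reach_criterion("reach_10_20_30"))` holds because every objective meets its threshold.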
- The `delay_after_gen` parameter is now deprecated and will be removed in a future release. If it is necessary to have a time delay after each generation, then assign a callback function/method to the `on_generation` parameter to pause the evolution.
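A minimal sketch of such a callback, assuming a fixed delay (the `DELAY_SECONDS` name is illustrative):

```python
import time

# Replacement for the deprecated delay_after_gen parameter: a callback
# assigned to on_generation that pauses after every generation.
DELAY_SECONDS = 0.01  # illustrative delay

def on_generation(ga_instance):
    # ga_instance is the pygad.GA instance passed in by PyGAD.
    time.sleep(DELAY_SECONDS)
```

Pass it as `on_generation=on_generation` when constructing `pygad.GA`.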
- When the `gene_space` parameter is used without a step, mutation occurs by adding a random value to the gene value. The random value is generated based on the 2 parameters `random_mutation_min_val` and `random_mutation_max_val`. For more information, check the [How Mutation Works with the gene_space Parameter?](https://pygad.readthedocs.io/en/latest/pygad_more.html#how-mutation-works-with-the-gene-space-parameter) section. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/229
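The mutation rule described above can be sketched as follows; this is an assumed illustration of the behavior, not PyGAD's source:

```python
import random

# Sketch: for a gene_space range without a step, mutation adds a random
# offset drawn from [random_mutation_min_val, random_mutation_max_val].
def mutate_without_step(gene_value,
                        random_mutation_min_val=-1.0,
                        random_mutation_max_val=1.0):
    offset = random.uniform(random_mutation_min_val, random_mutation_max_val)
    return gene_value + offset
```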
- Add `object` as a supported data type for int (`GA.supported_int_types`) and float (`GA.supported_float_types`). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/174
- Use a `raise` clause instead of `sys.exit(-1)` to terminate the execution. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/213
- … (`fitness_batch_size` set to a non-zero number).
- Fix a bug in the `pygad.py` script when finding the index of the best solution. It does not work properly with multi-objective optimization, where `self.best_solutions_fitness` has multiple columns:

```python
self.best_solution_generation = numpy.where(numpy.array(
    self.best_solutions_fitness) == numpy.max(numpy.array(self.best_solutions_fitness)))[0][0]
```
- A new module `pygad.utils.nsga2` is created that has the `NSGA2` class, which includes the functionalities of NSGA-II. The class has these methods: 1) `get_non_dominated_set()` 2) `non_dominated_sorting()` 3) `crowding_distance()` 4) `sort_solutions_nsga2()`. Check [this section](https://pygad.readthedocs.io/en/latest/pygad_more.html#multi-objective-optimization) for an example.
- Support of multi-objective optimization through the `NSGA2` class in the `pygad.utils.nsga2` module. Just return a `list`, `tuple`, or `numpy.ndarray` from the fitness function and the library will consider the problem as multi-objective optimization. All the objectives are expected to be maximization. Check [this section](https://pygad.readthedocs.io/en/latest/pygad_more.html#multi-objective-optimization) for an example.
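A fitness function for a 2-objective problem can then look like the sketch below. The weights and target values (50 and 30) are arbitrary illustration, not from the PyGAD docs; both objectives are maximized.

```python
import numpy

# Multi-objective fitness: returning a list makes PyGAD treat the
# problem as multi-objective (both objectives maximized).
def fitness_func(ga_instance, solution, solution_idx):
    output1 = numpy.sum(solution * numpy.array([4, -2, 3.5]))
    output2 = numpy.sum(solution * numpy.array([-1, 7, 0.5]))
    # Higher fitness = closer to each (illustrative) target value.
    fitness1 = 1.0 / (abs(output1 - 50) + 1e-6)
    fitness2 = 1.0 / (abs(output2 - 30) + 1e-6)
    return [fitness1, fitness2]
```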
- New parent selection operators in the `pygad.utils.parent_selection` module: 1) Tournament selection for NSGA-II 2) NSGA-II selection.
- The `plot_fitness()` method in the `pygad.plot` module has a new optional parameter named `label` to accept the labels of the plots. This is only used for multi-objective problems; otherwise, it is ignored. It defaults to `None` and accepts a `list`, `tuple`, or `numpy.ndarray`. The labels are used in a legend inside the plot.
- The plot color used in the `pygad.plot` module is changed to the greenish `#64f20c` color.
- A new instance attribute `pareto_fronts` is added to the `pygad.GA` instances; it holds the pareto fronts when solving a multi-objective problem.
- `gene_type` accepts a `list`, `tuple`, or `numpy.ndarray` for integer data types given that the precision is set to `None` (e.g. `gene_type=[float, [int, None]]`).
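How such a nested `gene_type` could be applied to a solution is sketched below; this is an illustrative interpretation (`apply_gene_types` is a hypothetical helper, not a PyGAD API): the first gene stays a float, the second is cast to `int` with no rounding precision.

```python
# Sketch of applying gene_type=[float, [int, None]] gene by gene.
def apply_gene_types(solution, gene_type):
    typed = []
    for value, spec in zip(solution, gene_type):
        if isinstance(spec, (list, tuple)):
            dtype, precision = spec
            # precision=None means plain casting, no rounding.
            value = dtype(value) if precision is None else round(value, precision)
        else:
            value = spec(value)
        typed.append(value)
    return typed
```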
- In the `cal_pop_fitness()` method, the fitness value is re-used if `save_best_solutions=True` and the solution is found in the `best_solutions` attribute. These parameters can also help re-use the fitness of a solution instead of calling the fitness function: `keep_elitism`, `keep_parents`, and `save_solutions`.
- The value `99999999999` is replaced by `float('inf')` in the 2 methods `wheel_cumulative_probs()` and `stochastic_universal_selection()` inside the `pygad.utils.parent_selection.ParentSelection` class.
- The `plot_result()` method in the `pygad.visualize.plot.Plot` class is removed. Instead, please use the `plot_fitness()` method if you did not upgrade yet.

Release Date 20 June 2023
- The `gene_space` parameter can no longer be assigned a tuple.
- … when the `gene_space` parameter has a member of type `tuple`.
- A new instance attribute called `gene_space_unpacked` which has the unpacked `gene_space`. It is used to solve duplicates. Infinite ranges in the `gene_space` are unpacked to a limited number of values (e.g. 100).
- … the `gene_space` attribute.
- When a `dict` is used with the `gene_space` attribute, the new gene value was calculated by summing 2 values: 1) the value sampled from the `dict` 2) a random value returned from the random mutation range defined by the 2 parameters `random_mutation_min_val` and `random_mutation_max_val`. This might cause the gene value to exceed the range limit defined in the `gene_space`. To respect the `gene_space` range, this release only returns the value from the `dict` without summing it to a random value.
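The new behavior described above can be sketched as follows; this is an assumed illustration (the function name is hypothetical), not PyGAD's source:

```python
import random

# Sketch: with gene_space={'low': 1, 'high': 5}, mutation now returns
# only the value sampled from the dict's range, instead of adding an
# extra random offset that could push the gene outside [1, 5].
def mutate_from_dict(space):
    return random.uniform(space['low'], space['high'])
```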
- … the `format()` method. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189
- In the `__init__()` of the `pygad.GA` class, the logged error messages are handled using a `try-except` block instead of repeating the `logger.error()` command. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189
- A new class named `CustomLogger` is created in the `pygad.cnn` module to create a default logger, using the `logging` module, that is assigned to the `logger` attribute. This class is extended by all other classes in the module. The constructors of these classes have a new parameter named `logger`, which defaults to `None`. If no logger is passed, then the default logger of the `CustomLogger` class is used.
- Apart from the `pygad.nn` module, the `print()` function in all other modules is replaced by the `logging` module to log messages.
- The callbacks `on_fitness()`, `on_parents()`, `on_crossover()`, and `on_mutation()` can return values. These returned values override the corresponding properties. The output of `on_fitness()` overrides the population fitness. The `on_parents()` function/method must return 2 values representing the parents and their indices. The output of `on_crossover()` overrides the crossover offspring. The output of `on_mutation()` overrides the mutation offspring.
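A sketch of an `on_parents()` callback that returns the 2 required values. The pass-through logic and placeholder indices are illustrative assumptions, not PyGAD's behavior:

```python
import numpy

# on_parents() override sketch: must return (parents, indices).
def on_parents(ga_instance, selected_parents):
    # Illustrative override: copy the parents unchanged and pair them
    # with placeholder indices 0..n-1.
    parents = numpy.array(selected_parents)
    indices = numpy.arange(parents.shape[0])
    return parents, indices
```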
- … when `fitness_batch_size` > 1. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/195
- When `allow_duplicate_genes=False` and a user-defined `gene_space` is used, it sometimes happens that there is no room to solve the duplicates between 2 genes by simply replacing the value of one gene by another gene. This release tries to solve such duplicates by looking for a third gene that will help in solving them. Check [this section](https://pygad.readthedocs.io/en/latest/pygad.html#prevent-duplicates-in-gene-values) for more information.
- `random_mutation_min_val` and `random_mutation_max_val` can accept iterables (list/tuple/numpy.ndarray) with a length equal to the number of genes. This enables customizing the mutation range for each individual gene. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198
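Per-gene mutation ranges can be sketched like this (an illustrative sketch, not PyGAD internals): each gene draws its random offset from its own `[min, max]` pair.

```python
import random

# Sketch of per-gene mutation ranges: gene i uses
# [min_vals[i], max_vals[i]] for its random offset.
def mutate_per_gene(solution, min_vals, max_vals):
    return [g + random.uniform(lo, hi)
            for g, lo, hi in zip(solution, min_vals, max_vals)]
```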
- `init_range_low` and `init_range_high` can accept iterables (list/tuple/numpy.ndarray) with a length equal to the number of genes. This enables customizing the initial range for each individual gene when creating the initial population.
- The `data` parameter in the `predict()` function of the `pygad.kerasga` module can be assigned a data generator. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/115 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207
- The `predict()` function of the `pygad.kerasga` module accepts 3 optional parameters: `batch_size=None`, `verbose=0`, and `steps=None`. Check the documentation of the [Keras Model.predict()](https://keras.io/api/models/model_training_apis) method for more information. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207
- … when `gene_space` is used with `int` or `float` data types. Check [this section](https://pygad.readthedocs.io/en/latest/pygad.html#limit-the-gene-value-range-using-the-gene-space-parameter). https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198
- Fix an issue with passing a user-defined function/method for parent selection. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/179
This release has a major change where the fitness function accepts a mandatory parameter referring to the instance of the `pygad.GA` class.

These are the release notes:
- Some methods defined in the `pygad.py` module are moved to the `pygad.utils`, `pygad.helper`, and `pygad.visualize` submodules.
- The `pygad.utils.parent_selection` module has a class named `ParentSelection` where all the parent selection operators exist. The `pygad.GA` class extends this class.
- The `pygad.utils.crossover` module has a class named `Crossover` where all the crossover operators exist. The `pygad.GA` class extends this class.
- The `pygad.utils.mutation` module has a class named `Mutation` where all the mutation operators exist. The `pygad.GA` class extends this class.
- The `pygad.helper.unique` module has a class named `Unique` where some helper methods exist to solve duplicate genes and make sure every gene is unique. The `pygad.GA` class extends this class.
- The `pygad.visualize.plot` module has a class named `Plot` where all the methods that create plots exist. The `pygad.GA` class extends this class.

```python
...
class GA(utils.parent_selection.ParentSelection,
         utils.crossover.Crossover,
         utils.mutation.Mutation,
         helper.unique.Unique,
         visualize.plot.Plot):
    ...
```
- Support of the `logging` module to log the outputs to both the console and a text file instead of using the `print()` function. This is done by assigning a `logging.Logger` to the new `logger` parameter. Check the [Logging Outputs](https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#logging-outputs) section for more information.
- A new instance attribute named `logger` to save the logger.
- The `fitness_func` parameter accepts a new parameter that refers to the instance of the `pygad.GA` class. Check this for an example: [Use Functions and Methods to Build Fitness Function and Callbacks](https://pygad.readthedocs.io/en/latest/README_pygad_ReadTheDocs.html#use-functions-and-methods-to-build-fitness-and-callbacks). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/163
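A logger such as the one described above can be built with the standard `logging` module; the handler/formatter configuration below is an illustrative sketch (the function name and levels are assumptions, not from the PyGAD docs).

```python
import logging

# Build a logging.Logger that writes to both the console and a text
# file; an object like this can be passed to the logger parameter.
def build_logger(name="pygad_logger", filename="pygad.log"):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.handlers.clear()  # avoid duplicate handlers on re-runs

    console = logging.StreamHandler()
    console.setLevel(logging.INFO)

    file_handler = logging.FileHandler(filename)
    file_handler.setLevel(logging.DEBUG)

    fmt = logging.Formatter("%(asctime)s %(levelname)s: %(message)s")
    console.setFormatter(fmt)
    file_handler.setFormatter(fmt)

    logger.addHandler(console)
    logger.addHandler(file_handler)
    return logger
```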
- … the `initial_population` parameter.
- … the `pop_fitness` parameter of the `best_solution()` method.
- … (`random_mutation_min_val` and `random_mutation_max_val`) instead of using the parameters `init_range_low` and `init_range_high`.
- The `summary()` method returns the summary as a single-line string. Just log/print the returned string to see it properly.
- The `callback_generation` parameter is removed. Use the `on_generation` parameter instead.
- … the `parallel_processing` parameter with Keras and PyTorch. As Keras/PyTorch are not thread-safe, the `predict()` method gives incorrect and weird results when more than 1 thread is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/145 https://github.com/ahmedfgad/TorchGA/issues/5 https://github.com/ahmedfgad/KerasGA/issues/6 Thanks to this [StackOverflow answer](https://stackoverflow.com/a/75606666/5426539).
- Replace `numpy.float` by `float` in the 2 parent selection operators, roulette wheel and stochastic universal. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/168
- The `summary()` method is supported to return a Keras-like summary of the PyGAD lifecycle.
- A new optional parameter called `fitness_batch_size` is supported to calculate the fitness function in batches. If it is assigned the value `1` or `None` (the default), then the normal flow is used, where the fitness function is called for each individual solution. If the `fitness_batch_size` parameter is assigned a value satisfying `1 < fitness_batch_size <= sol_per_pop`, then the solutions are grouped into batches of size `fitness_batch_size` and the fitness function is called once for each batch. In this case, the fitness function must return a list/tuple/numpy.ndarray with a length equal to the number of solutions passed. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/136
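A batched fitness function, as described above, can be sketched like this. The fitness formula (negated distance to a target sum of 10) is an arbitrary illustration:

```python
import numpy

# Batched fitness sketch: PyGAD passes a batch of solutions and their
# indices; the function must return one fitness value per solution.
def batch_fitness_func(ga_instance, batch_solutions, batch_indices):
    batch_solutions = numpy.array(batch_solutions)
    # One fitness per row: higher is better (closer to a sum of 10).
    return [-abs(numpy.sum(sol) - 10.0) for sol in batch_solutions]
```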
- The `cloudpickle` library (https://github.com/cloudpipe/cloudpickle) is used instead of the `pickle` library to pickle the `pygad.GA` objects. This solves the issue of having to redefine the functions (e.g. the fitness function). The `cloudpickle` library is added as a dependency in the `requirements.txt` file. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/159
- Support of assigning methods (besides functions) to the parameters: `fitness_func`, `crossover_type`, `mutation_type`, `parent_selection_type`, `on_start`, `on_fitness`, `on_parents`, `on_crossover`, `on_mutation`, `on_generation`, and `on_stop`. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/92 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138
- … `allow_duplicate_genes=True`. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/39
- Sampling from a `set()` is no longer supported in Python 3.11. Instead, sampling happens from a `list()`. Thanks to Marco Brenna for pointing out this issue.
- Fix an issue with `save_solutions=True` that causes the fitness function to be called for solutions that were already explored and have their fitness pre-calculated. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/160
- A new instance attribute `last_generation_elitism_indices` is added to hold the indices of the selected elitism. This attribute helps to re-use the fitness of the elitism instead of calling the fitness function.
- … the `best_solution()` method, which in turn saves some calls to the fitness function.
- … the `cal_pop_fitness()` method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/79#issuecomment-1439605442
PyGAD 2.18.2 release notes
- Remove `numpy.int` and `numpy.float` from the list of supported data types. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/151 https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/152
- Call the `on_crossover()` callback function even if `crossover_type` is `None`. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138
- Call the `on_mutation()` callback function even if `mutation_type` is `None`. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138