Simple example of minimization with a genetic algorithm

data mining, Python, optimization, genetic algorithm
2021-09-21 15:00:42

I've been searching for examples of how to use a genetic-algorithm approach in Python to find the point at which a function reaches its minimum. I looked at the DEAP documentation, but the examples there were hard for me to follow. For example:

def function(x, y):
    return x*y + 3*x - x**2

I'm looking for references on how to build a genetic algorithm into which I can feed some initial random values for x and y (not drawn from the same dimensions). Can someone with experience creating and using genetic algorithms give me some guidance?

1 Answer

Here is a trivial example that captures the essence of genetic algorithms more meaningfully than the polynomial you supplied, since that polynomial can be solved with stochastic gradient descent, a simpler minimization technique. For that reason, I instead recommend this excellent article and example by Will Larson.
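To make that point concrete, here is a minimal sketch of what that simpler approach looks like on your polynomial (plain gradient descent rather than the stochastic variant, since with a single closed-form function there is nothing to sample). Note that the function's only critical point, (0, -3), is a saddle and f is unbounded below, so the iterates eventually head off toward minus infinity; the loop is capped just to show the mechanics.

# Gradient descent on f(x, y) = x*y + 3*x - x**2 (a sketch, not a
# robust optimizer: f is unbounded below, so there is no global
# minimum for the iterates to converge to).
def f(x, y):
    return x*y + 3*x - x**2

def grad_f(x, y):
    return y + 3 - 2*x, x  # (df/dx, df/dy)

x, y = 2.0, -1.0  # arbitrary starting guess
lr = 0.05         # step size
for _ in range(100):
    gx, gy = grad_f(x, y)
    x, y = x - lr*gx, y - lr*gy

print(x, y, f(x, y))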

Quoting from the original article:

Defining a problem to optimize

Now we're going to put together a simple example of using a genetic algorithm in Python. We're going to optimize a very simple problem: trying to create a list of N numbers that equal X when summed together.

If we set N = 5 and X = 200, then these would all be appropriate solutions.

lst = [40,40,40,40,40]
lst = [50,50,50,25,25]
lst = [200,0,0,0,0]
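(A quick sanity check, not from the article: each candidate has five entries that sum exactly to the target, so its distance from X is zero.)

for lst in ([40, 40, 40, 40, 40], [50, 50, 50, 25, 25], [200, 0, 0, 0, 0]):
    assert len(lst) == 5 and sum(lst) == 200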

Take a look at the whole article, but here is the complete code:

"""
# Example usage
from genetic import *
target = 371
p_count = 100
i_length = 6
i_min = 0
i_max = 100
p = population(p_count, i_length, i_min, i_max)
fitness_history = [grade(p, target),]
for i in range(100):
    p = evolve(p, target)
    fitness_history.append(grade(p, target))

for datum in fitness_history:
    print(datum)
"""
from random import randint, random
from operator import add
from functools import reduce  # reduce is no longer a builtin in Python 3

def individual(length, min, max):
    'Create a member of the population.'
    return [ randint(min,max) for x in range(length) ]

def population(count, length, min, max):
    """
    Create a number of individuals (i.e. a population).

    count: the number of individuals in the population
    length: the number of values per individual
    min: the minimum possible value in an individual's list of values
    max: the maximum possible value in an individual's list of values

    """
    return [ individual(length, min, max) for x in range(count) ]

def fitness(individual, target):
    """
    Determine the fitness of an individual. Lower is better, since it
    measures the distance from the target (0 is a perfect solution).

    individual: the individual to evaluate
    target: the target number individuals are aiming for
    """
    sum = reduce(add, individual, 0)
    return abs(target-sum)

def grade(pop, target):
    'Find average fitness for a population.'
    summed = reduce(add, (fitness(x, target) for x in pop))
    return summed / (len(pop) * 1.0)

def evolve(pop, target, retain=0.2, random_select=0.05, mutate=0.01):
    graded = [ (fitness(x, target), x) for x in pop]
    graded = [ x[1] for x in sorted(graded)]
    retain_length = int(len(graded)*retain)
    parents = graded[:retain_length]
    # randomly add other individuals to
    # promote genetic diversity
    for individual in graded[retain_length:]:
        if random_select > random():
            parents.append(individual)
    # mutate some individuals
    for individual in parents:
        if mutate > random():
            pos_to_mutate = randint(0, len(individual)-1)
            # this mutation is not ideal, because it
            # restricts the range of possible values,
            # but the function is unaware of the min/max
            # values used to create the individuals.
            individual[pos_to_mutate] = randint(
                min(individual), max(individual))
    # crossover parents to create children
    parents_length = len(parents)
    desired_length = len(pop) - parents_length
    children = []
    while len(children) < desired_length:
        male = randint(0, parents_length-1)
        female = randint(0, parents_length-1)
        if male != female:
            male = parents[male]
            female = parents[female]
            half = len(male) // 2  # integer division: list indices must be ints
            child = male[:half] + female[half:]
            children.append(child)
    parents.extend(children)
    return parents

I think it could be pedagogically quite useful to solve your original problem with this algorithm, and then to also construct a solution using stochastic grid search or stochastic gradient descent; you would gain a deep understanding of the juxtaposition of those three algorithms.
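As one possible starting point for that exercise (a rough sketch, not from Larson's article), the same recipe can be pointed at your two-variable function: individuals become (x, y) pairs, the fitness to minimize is f itself, and the search is confined to an arbitrary box because f has no global minimum.

import random

def f(x, y):
    return x*y + 3*x - x**2

LO, HI = -10.0, 10.0  # arbitrary search box; f itself is unbounded below

def evolve_xy(pop, retain=0.2, mutate=0.1):
    # Rank by objective value: lower f means fitter, since we minimize.
    graded = sorted(pop, key=lambda ind: f(*ind))
    parents = graded[:int(len(graded) * retain)]
    children = []
    while len(parents) + len(children) < len(pop):
        a, b = random.sample(parents, 2)   # two distinct parents
        child = [a[0], b[1]]               # crossover: x from one, y from the other
        if random.random() < mutate:       # occasionally replace one coordinate
            child[random.randint(0, 1)] = random.uniform(LO, HI)
        children.append(tuple(child))
    return parents + children

pop = [(random.uniform(LO, HI), random.uniform(LO, HI)) for _ in range(100)]
for _ in range(50):
    pop = evolve_xy(pop)
best = min(pop, key=lambda ind: f(*ind))
print(best, f(*best))

Within the box the best candidates drift toward the corner where f is smallest; remove the box and the population simply chases f downhill forever, which is exactly why the sum-to-X toy problem above makes a better first illustration.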

Hope this helps!