Package org.apache.mahout.ep

Provides basic evolutionary optimization using recorded-step mutation.

Package org.apache.mahout.ep Description

Provides basic evolutionary optimization using recorded-step mutation.

With this style of optimization, we can optimize a function f: R^n -> R by stochastic hill-climbing with some of the benefits of conjugate-gradient-style history encoded in the mutation function. The mutation function adapts to allow a weakly directed search rather than the more conventional symmetric Gaussian mutation.

With recorded-step mutation, the meta-mutation parameters are all encoded in the current state of each point. This avoids the classic problem of needing as many mutation-rate parameters as state parameters, and then still more parameters to describe the meta-mutation rate. Instead, we store the previous point and a single omni-directional mutation scale. Mutation is performed by first mutating along the line formed by the previous and current points and then adding a scaled symmetric Gaussian. The magnitude of the omni-directional mutation is then itself mutated, using its own value as the scale.
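The mutation step described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation; the class and field names are hypothetical, and the acceptance/selection machinery of the evolutionary process is omitted.

```java
import java.util.Random;

// Hypothetical sketch of recorded-step mutation. The state of each point
// consists of the current point, the previous point, and one
// omni-directional mutation scale.
public class RecordedStepMutation {
    private final Random rand = new Random();
    private double[] previous;   // point before the last step
    private double[] current;    // current point
    private double omni;         // omni-directional mutation scale

    public RecordedStepMutation(double[] start, double omni) {
        this.previous = start.clone();
        this.current = start.clone();
        this.omni = omni;
    }

    // Produce a mutated point and update the recorded state.
    public double[] mutate() {
        int n = current.length;
        double[] next = new double[n];
        // One Gaussian scale for the directed component, applied to the
        // recorded step (current - previous).
        double directed = rand.nextGaussian();
        for (int i = 0; i < n; i++) {
            double step = current[i] - previous[i];
            // Move along the recorded step, then add a symmetric
            // Gaussian scaled by the omni-directional magnitude.
            next[i] = current[i] + directed * step + omni * rand.nextGaussian();
        }
        // Mutate the omni-directional magnitude using itself as the scale.
        omni = Math.abs(omni * (1 + rand.nextGaussian()));
        previous = current;
        current = next;
        return next;
    }
}
```

Because the step direction is stored in the point itself, a sequence of successful moves in one direction lengthens the recorded step, biasing later mutations toward that direction.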

Because it is convenient to leave the parameter space unrestricted, this package also provides convenient parameter-mapping methods. These methods map the set of reals onto a finite open interval (a, b) in such a way that lim_{x -> -inf} f(x) = a and lim_{x -> +inf} f(x) = b. The linear mapping is defined so that f(0) = (a + b)/2; the exponential mapping requires that a and b are both positive and has f(0) = sqrt(ab). The linear mapping is useful for values that must stay roughly within a range and are roughly uniform near the center of that range. The exponential mapping is useful for values that must stay within a range but whose distribution is roughly exponential near the geometric mean of the end-points. An identity mapping is also supplied.
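The two non-trivial mappings can be sketched as below. The method names here are illustrative rather than the package's actual API; the exponential mapping is simply the linear mapping applied in log space.

```java
// Hypothetical sketch of the parameter mappings described above.
public final class Mappings {
    private Mappings() {}

    // Linear mapping: maps all of R onto the open interval (a, b)
    // via a sigmoid, so that f(0) = (a + b) / 2.
    public static double softLimit(double x, double a, double b) {
        double sigmoid = 1.0 / (1.0 + Math.exp(-x));
        return a + (b - a) * sigmoid;
    }

    // Exponential mapping: maps R onto (a, b) for 0 < a < b by applying
    // the linear mapping in log space, so that f(0) = sqrt(a * b).
    public static double logLimit(double x, double a, double b) {
        return Math.exp(softLimit(x, Math.log(a), Math.log(b)));
    }
}
```

For example, `softLimit(0, 2, 4)` is 3 (the midpoint), while `logLimit(0, 1, 4)` is 2 (the geometric mean), and both tend to the interval end-points as x goes to plus or minus infinity.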

Copyright © 2008–2017 The Apache Software Foundation. All rights reserved.