Bitmask evolution optimization method from Aleksey Vaneev


Post

I've come up with an interesting and very simple evolutionary optimization method. Since I have not seriously studied the state of the art of evolutionary optimization strategies, I may be recreating prior art, but the method is nevertheless quite interesting in itself. You invert a random range of the lower bits of the parameter values, and it just works. I've added a couple of tricks on top of that to increase robustness. Convergence time is quite long, as with most evolutionary optimization methods.

The URL is: https://github.com/avaneev/biteopt
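The core mutation step, roughly (a simplified sketch, not the actual library code; the 16-bit parameter encoding and the names here are only illustrative):

Code:

#include <cstdint>
#include <random>

// Simplified sketch of the bit-inversion mutation (illustrative, not the
// actual biteopt code). A parameter value is assumed to be encoded as a
// 16-bit unsigned integer spanning its allowed range.
const int ParamBits = 16;

uint32_t mutateLowerBits( uint32_t value, std::mt19937& rng )
{
    // Pick how many of the lowest bits to invert (1 .. ParamBits).
    std::uniform_int_distribution< int > bitCount( 1, ParamBits );
    const int n = bitCount( rng );

    // Inverting few bits makes a fine local step, inverting many bits a
    // coarse jump, so the step size itself is randomized.
    const uint32_t mask = ( 1u << n ) - 1;
    return value ^ mask;
}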

Post

Just curious, what's the use of this algorithm in music-dsp? Optimize something like this? http://www.convexoptimization.com/wikim ... verberator
~stratum~

Post

stratum wrote:Just curious, what's the use of this algorithm in music-dsp? Optimize something like this? http://www.convexoptimization.com/wikim ... verberator
It can be used for that as well, if you can formulate the optimization problem as a "cost" function.
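For example (purely illustrative, the model and names below are hypothetical), a cost function simply maps a candidate parameter vector to a single number that the optimizer tries to minimize:

Code:

#include <cmath>
#include <vector>

// Hypothetical example of a "cost" function: the least-squares error of a
// two-parameter model y = a * exp( -b * t ) against some target samples.
// The optimizer's job is then to find the params that minimize this value.
double cost( const std::vector< double >& params,
    const std::vector< double >& target )
{
    double err = 0.0;

    for( size_t i = 0; i < target.size(); i++ )
    {
        const double t = (double) i / target.size();
        const double y = params[ 0 ] * std::exp( -params[ 1 ] * t );
        err += ( y - target[ i ]) * ( y - target[ i ]);
    }

    return err;
}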

Post

Aleksey Vaneev wrote:It can be used for that as well, if you can formulate the optimization problem as a "cost" function.
A cost function could probably be defined for that in terms of the "uncoloredness" of the impulse response. I haven't read much about it, though.
Thanks for sharing the code.

edit: Apparently the link itself states a more important objective.
~stratum~

Post

Very good and straightforward, thank you

Post

Aleksey Vaneev wrote:I've come up with an interesting and very simple evolutionary optimization method. Since I have not seriously studied the state of the art of evolutionary optimization strategies, I may be recreating prior art, but the method is nevertheless quite interesting in itself. You invert a random range of the lower bits of the parameter values, and it just works. I've added a couple of tricks on top of that to increase robustness. Convergence time is quite long, as with most evolutionary optimization methods.

The URL is: https://github.com/avaneev/biteopt
Seems to be close to genetic algorithms when you don't keep the ancestors?

Post

Miles1981 wrote:
Aleksey Vaneev wrote:I've come up with an interesting and very simple evolutionary optimization method. Since I have not seriously studied the state of the art of evolutionary optimization strategies, I may be recreating prior art, but the method is nevertheless quite interesting in itself. You invert a random range of the lower bits of the parameter values, and it just works. I've added a couple of tricks on top of that to increase robustness. Convergence time is quite long, as with most evolutionary optimization methods.

The URL is: https://github.com/avaneev/biteopt
Seems to be close to genetic algorithms when you don't keep the ancestors?
Yes, this method does not keep ancestors. But as an addition to the strategy it does keep a history of best solutions and performs crossing-over (by default in 10% of steps). However, this does not always result in faster convergence.
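Schematically, a single step with the optional history crossover looks about like this (a simplified sketch, not the exact library code; the 16-bit encoding is illustrative):

Code:

#include <cstdint>
#include <random>
#include <vector>

const double CrossoverProb = 0.10; // crossover in ~10% of steps

uint32_t step( uint32_t current, const std::vector< uint32_t >& bestHistory,
    std::mt19937& rng )
{
    std::uniform_real_distribution< double > uni( 0.0, 1.0 );

    if( !bestHistory.empty() && uni( rng ) < CrossoverProb )
    {
        // Crossing-over: pick a random previously best solution and mix a
        // random selection of its bits into the current solution.
        std::uniform_int_distribution< size_t > pick( 0, bestHistory.size() - 1 );
        const uint32_t mate = bestHistory[ pick( rng )];
        const uint32_t mask = rng() & 0xFFFF; // random bit-selection mask

        return ( current & ~mask ) | ( mate & mask );
    }

    // Otherwise mutate: invert a random range of the lower bits.
    std::uniform_int_distribution< int > bitCount( 1, 16 );
    return current ^ (( 1u << bitCount( rng )) - 1 );
}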

Post

OK, so a variation of genetic algorithms.
At some point I tried some comparisons with Numerical Recipes' simulated annealing (which has nothing to do with simulated annealing...), and GAs are usually quite fast, even if they don't find the minimum in all cases.

Did you compare your hybridization process (mutate in 90% of the steps) with the usual hybridization process (mutate a proportion of the population at each step, cross over another proportion, and keep another proportion, all drawn from the set of best candidates)?
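By the usual process I mean something along these lines (a rough sketch, the proportions are only illustrative):

Code:

#include <algorithm>
#include <random>
#include <vector>

// Rough sketch of a classical GA generation step: keep the best candidates
// as-is, create part of the next generation by crossover between good
// candidates and the rest by mutating good candidates.
struct Candidate
{
    std::vector< double > params;
    double cost;
};

void nextGeneration( std::vector< Candidate >& pop, std::mt19937& rng )
{
    // Sort so the best (lowest-cost) candidates come first.
    std::sort( pop.begin(), pop.end(),
        []( const Candidate& a, const Candidate& b ){ return a.cost < b.cost; });

    const size_t keep = std::max< size_t >( 1, pop.size() / 5 ); // ~20% kept
    const size_t cross = pop.size() * 2 / 5;                     // ~40% crossed over
    std::uniform_int_distribution< size_t > parent( 0, keep - 1 );
    std::normal_distribution< double > noise( 0.0, 0.1 );

    for( size_t i = keep; i < pop.size(); i++ )
    {
        const Candidate& a = pop[ parent( rng )];
        const Candidate& b = pop[ parent( rng )];

        for( size_t j = 0; j < pop[ i ].params.size(); j++ )
        {
            if( i < keep + cross )
            {
                // Crossover: take each parameter from either parent.
                pop[ i ].params[ j ] = ( rng() & 1 ? a.params[ j ] : b.params[ j ]);
            }
            else
            {
                // Mutation: perturb a parameter of a good candidate.
                pop[ i ].params[ j ] = a.params[ j ] + noise( rng );
            }
        }

        // pop[ i ].cost would be re-evaluated with the cost function here.
    }
}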

Post

Miles1981 wrote:OK, so a variation of genetic algorithms.
At some point I tried some comparisons with Numerical Recipes' simulated annealing (which has nothing to do with simulated annealing...), and GAs are usually quite fast, even if they don't find the minimum in all cases.

Did you compare your hybridization process (mutate in 90% of the steps) with the usual hybridization process (mutate a proportion of the population at each step, cross over another proportion, and keep another proportion, all drawn from the set of best candidates)?
So far I have made no comparisons to other genetic approaches. What I've discovered is that in my approach applying "mutation" and crossing-over simultaneously does not work well - it helps in some cases, but not always. The two operations are applied independently, but interleaved stochastically they do reduce convergence time.

Post

It's quite cool: I was able to simplify the optimizer further; it now includes the CBEOOptimizer2 class, which does not even require crossing-over or a history of best solutions. Yet the convergence time is still quite low. However, I can't quite explain why its accidentally discovered, mysterious "previous attempt intermix" operation works.

Global optimization is still beyond the reach of this strategy, but maybe I can find some solution eventually.

Post

What is an evolutionary algorithm? I am not familiar with it.

Post

camsr wrote:What is an evolutionary algorithm? I am not familiar with it.
An evolutionary optimization strategy means the current solution is "evolved" into its next state from its previous state by means of random operators. These strategies also use a crossing-over operation, which mixes the current solution with ancestors or previous best solutions.
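In its most basic form (just to illustrate the general idea, not my method specifically) the loop is:

Code:

#include <random>
#include <vector>

// Bare-bones evolutionary loop: keep one current best solution, randomly
// perturb a copy of it, and accept the copy only if its cost is lower.
std::vector< double > evolve( std::vector< double > best,
    double (*costFn)( const std::vector< double >& ),
    const int iterations, std::mt19937& rng )
{
    double bestCost = costFn( best );
    std::normal_distribution< double > noise( 0.0, 0.1 );

    for( int i = 0; i < iterations; i++ )
    {
        std::vector< double > trial = best;

        for( double& p : trial )
        {
            p += noise( rng ); // random "mutation" operator
        }

        const double c = costFn( trial );

        if( c < bestCost ) // keep the offspring only if it improves
        {
            best = trial;
            bestCost = c;
        }
    }

    return best;
}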

Meanwhile, I was able to further improve the strategy. I also made comparisons with the Nelder-Mead simplex method, and my strategy proves to be more robust (it stably produces a solution from any initial state) and converges about 2 times faster on most functions I've tried. The Nelder-Mead method stopped prematurely in half of the cases, producing a sub-optimal solution. So, it's a win.

Post

Nelder-Mead is a local optimization algorithm, whereas GAs are global ones. The fact that they are global helps them in tricky valleys of the cost function IMHO, so I'm not really surprised by this result ;)

Post

Miles1981 wrote:Nelder-Mead is a local optimization algorithm, whereas GAs are global ones. The fact that they are global helps them in tricky valleys of the cost function IMHO, so I'm not really surprised by this result ;)
GAs are not all equally good at global optimization. My strategy also strives to provide the globally best solution, and it is capable of that. The current problem is that when the global minimum's basin is very narrow and there are many local minima, the strategy settles on a sub-optimal minimum - it just can't "hit" the global minimum's position with a lower cost.

The problematic function I'm now struggling with is

Code:

cost = 100 * sqrt( fabs( y - 0.01 * x * x )) + 0.01 * fabs( x + 10 )
The functions I've tried with the Nelder-Mead method are mostly local optimization problems, so the comparison was fair.

Post

It's been a while since I looked at this sort of stuff, but it's always difficult to be sure of reaching the absolute optimal solution in difficult energy landscapes. I worked with GAs and found that in the problem domain I was looking at (load balancing for finite element problems) it took quite a bit of tinkering with mutation rates to climb out of local minima. Perhaps some sort of hybrid method would help, i.e. use the evolutionary method for a few iterations and then switch to a local optimization scheme to reduce the iterations required.
