Bitmask evolution optimization method from Aleksey Vaneev
Aleksey Vaneev
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
I've come up with an interesting and very simple evolutionary optimization method. Since I have not seriously studied the "state of the art" of evolutionary optimization strategies, I may be recreating prior art. But nevertheless the offered method is quite interesting in itself: you invert a random range of lower bits of parameter values, and it just works. I've added a couple of tricks on top of that to increase robustness. Convergence time is quite long, as with most evolutionary optimization methods.
The URL is: https://github.com/avaneev/biteopt
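A minimal sketch of the bit-inversion step, assuming each parameter is quantized to a 16-bit integer grid (the names and the encoding are illustrative, simplified relative to the actual implementation):
Code:
#include <cstdint>
#include <cstdlib>

// Invert a random range of lower bits of an integer-encoded parameter.
uint16_t mutateParam( uint16_t v )
{
    const int bits = 1 + std::rand() % 16;    // invert between 1 and 16 bits
    const uint32_t mask = ( 1u << bits ) - 1; // mask covering the lower bits
    return (uint16_t) ( v ^ mask );           // flip those bits
}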
-
stratum
- KVRAF
- 2256 posts since 29 May, 2012
Just curious, what's the use of this algorithm in music-dsp? Optimize something like this? http://www.convexoptimization.com/wikim ... verberator
~stratum~
Aleksey Vaneev
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
stratum wrote:Just curious, what's the use of this algorithm in music-dsp? Optimize something like this? http://www.convexoptimization.com/wikim ... verberator
It can be used for that as well, if you can formulate the optimization problem as a "cost" function.
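For instance, a DSP-flavoured cost function can be as simple as the following sketch (the one-pole filter objective and all names here are illustrative, not part of biteopt; the optimizer only ever sees the returned scalar):
Code:
#include <cmath>

// Illustrative cost function: how far does a candidate one-pole
// coefficient "a" land from a desired magnitude at a test frequency?
// Any objective works as long as it reduces to a single number.
double cost( const double* p )
{
    const double a = p[ 0 ];              // candidate coefficient, 0 <= a < 1
    const double target = 0.5;            // desired magnitude at test freq
    const double w = 0.7853981633974483;  // test frequency pi/4 rad/sample
    // Magnitude of H(z) = (1 - a) / (1 - a * z^-1) evaluated at z = e^(jw):
    const double mag = ( 1.0 - a )
        / std::sqrt( 1.0 + a * a - 2.0 * a * std::cos( w ));
    const double d = mag - target;
    return d * d;                         // squared deviation; 0 = perfect fit
}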
-
stratum
- KVRAF
- 2256 posts since 29 May, 2012
Aleksey Vaneev wrote:It can be used for that as well, if you can formulate the optimization problem as a "cost" function.
Probably a cost function can be defined for that in terms of the "uncoloredness of the impulse response". I didn't read about it much, though.
Thanks for sharing the code.
edit: Apparently the link itself states a more important objective.
~stratum~
-
Zaphod (giancarlo)
- KVRAF
- 2596 posts since 23 Jun, 2006
Very good and straightforward, thank you
-
Miles1981
- KVRian
- 1379 posts since 26 Apr, 2004 from UK
Aleksey Vaneev wrote:I've come up with an interesting and very simple evolutionary optimization method. [...] You invert a random range of lower bits of parameter values and it just works.
Seems to be close to genetic algorithms when you don't keep the ancestors?
Aleksey Vaneev
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
Miles1981 wrote:Seems to be close to genetic algorithms when you don't keep the ancestors?
Yes, this method does not keep ancestors. But on top of the basic strategy it does keep a history of best solutions and performs crossing-over (by default in 10% of steps). However, this does not always result in faster convergence.
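As a rough sketch of that schedule (the uniform bit-mix crossover and all names here are assumptions on my part, not the actual biteopt code):
Code:
#include <cstdint>
#include <cstdlib>

// Evolve one encoded parameter: mutate in ~90% of steps, cross over
// with a randomly chosen remembered best solution in the other ~10%.
uint16_t evolveParam( uint16_t cur, const uint16_t* bestHist, int histSize )
{
    if( std::rand() % 100 < 10 )
    {
        // Crossing-over: take a random mix of bits from a past best solution.
        const uint16_t mate = bestHist[ std::rand() % histSize ];
        const uint16_t mask = (uint16_t) std::rand(); // random bit mask
        return (uint16_t) (( cur & mask ) | ( mate & ~mask ));
    }

    // Mutation: invert a random range of lower bits, as sketched earlier.
    const int bits = 1 + std::rand() % 16;
    return (uint16_t) ( cur ^ (( 1u << bits ) - 1 ));
}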
-
Miles1981
- KVRian
- 1379 posts since 26 Apr, 2004 from UK
OK, so a variation of genetic algorithms.
I tried at some point some comparisons with Numerical Recipes' simulated annealing (which has nothing to do with simulated annealing...), and GAs are usually quite fast, even if they don't find the minimum in all cases.
Did you compare your hybridization process (mutate in 90% of the steps) with the usual hybridization process (mutate a proportion at each step, cross over another proportion, and keep another proportion, all depending on the set of best candidates)?
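For reference, a schematic of that classic generational scheme (the proportions and names are illustrative; the operators are passed in as callbacks):
Code:
#include <algorithm>
#include <cstdlib>
#include <vector>

struct Individual { std::vector< double > genes; double cost; };

// One generation of a textbook GA: keep an elite fraction unchanged,
// breed a fraction by crossing over elites, and mutate the remainder.
void nextGeneration( std::vector< Individual >& pop,
    Individual (*crossover)( const Individual&, const Individual& ),
    Individual (*mutate)( const Individual& ))
{
    std::sort( pop.begin(), pop.end(),
        []( const Individual& a, const Individual& b )
        { return a.cost < b.cost; });

    const std::size_t keep = std::max< std::size_t >( 1, pop.size() / 10 );
    const std::size_t cross = pop.size() * 6 / 10; // ~60% bred from elites

    for( std::size_t i = keep; i < keep + cross && i < pop.size(); i++ )
        pop[ i ] = crossover( pop[ i % keep ], pop[ std::rand() % keep ]);

    for( std::size_t i = keep + cross; i < pop.size(); i++ )
        pop[ i ] = mutate( pop[ i % keep ]); // remaining ~30% are mutants
}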
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
Miles1981 wrote:OK, so a variation of genetic algorithms. [...] Did you compare your hybridization process (mutate in 90% of the steps) with the usual hybridization process?
So far I have made no comparisons to other genetic approaches. What I've discovered is that in my approach simultaneous "mutation" and crossing-over does not work well - in some cases it has a benefit, but not always. These operations are applied independently, but in stochastic tandem they reduce convergence time.
Aleksey Vaneev
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
It's quite cool: I was able to simplify the optimizer further; it now includes the CBEOOptimizer2 class, which does not even require crossing-over or a history of best solutions. Yet the convergence time is quite low. However, I can't quite explain why its accidentally discovered, mysterious "previous attempt intermix" operation works.
Global optimization is still beyond the reach of this strategy, but maybe I can find some solution eventually.
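As a rough illustration of what an intermix of this kind can look like in general (a schematic blend only - purely a guess at the shape of the operation, not the exact code from the CBEOOptimizer2 class):
Code:
// Hypothetical reading of "previous attempt intermix": blend the newly
// generated value with the value tried on the previous step.
double intermix( double newVal, double prevVal, double w )
{
    return newVal + w * ( prevVal - newVal ); // w = 0 keeps new, w = 1 keeps prev
}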
Aleksey Vaneev
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
camsr wrote:What is an evolutionary algorithm? I am not familiar with it.
An evolutionary optimization strategy means the current solution is "evolved" to its next state from its previous state by means of random operators. These strategies also use a crossing-over operation that involves mixing with ancestors or previous best solutions.
Meanwhile, I was able to further improve the strategy. I also made comparisons to the Nelder-Mead simplex method, and my strategy proves to be more robust (it stably produces a solution from any initial state) and converges 2 times faster on most functions I've tried. The Nelder-Mead method stopped prematurely in half of the cases, producing a sub-optimal solution. So, it's a win.
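In its barest form, such a strategy is just a loop that proposes a random variation and keeps it only when the cost improves (a generic sketch, with mutate() and cost() standing in for the concrete operators and objective):
Code:
// Bare-bones evolutionary loop: evolve the current solution with a
// random operator and accept the change only if the cost decreases.
double optimize( double x, double (*mutate)( double ),
    double (*cost)( double ), int iters )
{
    double bestCost = cost( x );

    for( int i = 0; i < iters; i++ )
    {
        const double trial = mutate( x ); // random variation of the solution
        const double c = cost( trial );

        if( c < bestCost ) // greedy acceptance of improvements only
        {
            x = trial;
            bestCost = c;
        }
    }

    return x;
}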
-
Miles1981
- KVRian
- 1379 posts since 26 Apr, 2004 from UK
Nelder-Mead is a local optimization algorithm, whereas GAs are global ones. The fact that they are global helps them in tricky valleys of the cost function IMHO, so I'm not really surprised by this result.
- KVRAF
- Topic Starter
- 4021 posts since 7 Sep, 2002
Miles1981 wrote:Nelder-Mead is a local optimization algorithm, whereas GAs are global ones.
GAs are not all equally good at global optimization. My strategy also strives to provide the globally best solution, and it is capable of that. The current problem is that when the global minimum's basin is very narrow and there are many local minima, the strategy settles on a sub-optimal minimum - it just can't "hit" the global minimum's position with a lower cost.
The problematic function I'm now struggling with is
Code:
cost = 100 * sqrt( fabs( y - 0.01 * x * x )) + 0.01 * fabs( x + 10 )
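(For reference, this appears to be the Bukin function N.6 test problem: its global minimum of 0 at x = -10, y = 1 lies along a narrow, curved valley, which is exactly the narrow-minimum situation described above.)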
-
- KVRian
- 687 posts since 17 Sep, 2007 from Planet Thanet
It's been a while since I looked at this sort of stuff, but it's always difficult to be sure of reaching the absolute optimal solution in difficult energy landscapes. I worked with GAs and found that, in the problem domain I was looking at (load balancing for finite element problems), it took quite a bit of tinkering with mutation rates to climb out of local minima. Perhaps some sort of hybrid method would help, i.e. use the evolutionary method for a few iterations and then switch to a local optimization scheme to reduce the iterations required.
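That two-phase idea might look like this in outline (a sketch only; both phases are interchangeable callbacks rather than any particular library's API):
Code:
// Hybrid scheme: a fixed budget of global evolutionary search to escape
// local minima, followed by a local method (e.g. Nelder-Mead) to polish.
double hybridOptimize( double x0,
    double (*globalPhase)( double start, int budget ),
    double (*localPolish)( double start ))
{
    const double coarse = globalPhase( x0, 1000 ); // global exploration phase
    return localPolish( coarse );                  // local refinement phase
}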