On Jul 22, 2013, at 3:46 PM, David vun Kannon wrote:

> Last week I had a quick read of your GECCO 2013 paper on using Meta-EA to directly find the optimum, instead of finding good parameters (what I'm calling Meta-EA For The Win). My first thought about how to do this in ECJ is to just make a set of subpopulations, each with a different set of parameter settings.  Is that on the right track? 

Well, you *could* do that I guess but it'd be hard to parallelize it (the crucial part) and you couldn't have a meta-meta-EA or a meta-meta-meta-EA.  :-)

The new version of ECJ has our meta-EA code built into it.  It's a single file: a Problem subclass.  Here's what it does.  It takes a DoubleVectorIndividual and decodes it into lower-level EA parameter settings (as strings) according to rules specified in the meta-EA parameter file.  Then it tests those parameter settings by firing up *another* ECJ run (inside the same Java process) with a parameter database that has those settings applied.  After the run is done, it pulls out the best fitness of the run, and that becomes the fitness of the decoded DoubleVectorIndividual.  It works really well, and it's completely compatible with distributed evaluation, so you can do the decoding and testing on remote slave machines.
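In case the decode-and-nested-run idea is unclear, here's a self-contained toy sketch of it in plain Java. This is NOT ECJ's actual meta-EA class: every name here (MetaFitnessSketch, innerRun, metaEvaluate, the gene-to-parameter mapping) is made up for illustration. The meta-level genome decodes into two inner-EA parameters (mutation step size and evaluation budget), a small inner optimizer runs with those settings, and the inner run's best fitness is reported back as the meta-individual's fitness.

```java
import java.util.Random;

// Conceptual sketch only (not ECJ's MetaProblem): a meta-EA fitness
// function that decodes a real-valued genome into inner-EA parameter
// settings, runs an inner optimizer with them, and returns the inner
// run's best fitness as the meta-individual's fitness.
public class MetaFitnessSketch {
    static final Random RNG = new Random(42);

    // Inner objective: maximize -sum(x^2); the optimum is the origin,
    // so fitness is always <= 0.
    static double objective(double[] x) {
        double s = 0;
        for (double v : x) s -= v * v;
        return s;
    }

    // A toy inner "EA": a (1+1)-style hill climber whose mutation step
    // size and evaluation budget are the parameters being tuned.
    static double innerRun(double mutationStdev, int evaluations) {
        double[] best = new double[5];
        for (int i = 0; i < best.length; i++)
            best[i] = RNG.nextDouble() * 10 - 5;     // random start in [-5, 5]
        double bestFit = objective(best);
        for (int e = 0; e < evaluations; e++) {
            double[] child = best.clone();
            for (int i = 0; i < child.length; i++)
                child[i] += RNG.nextGaussian() * mutationStdev;
            double f = objective(child);
            if (f > bestFit) { bestFit = f; best = child; }
        }
        return bestFit;                              // best fitness of the inner run
    }

    // Meta-level evaluation: two genes in [0, 1] decode into a mutation
    // stdev in [0.01, 1.01] and a budget of 100..1100 evaluations.
    static double metaEvaluate(double[] metaGenome) {
        double stdev = 0.01 + metaGenome[0];
        int budget = 100 + (int) (metaGenome[1] * 1000);
        return innerRun(stdev, budget);
    }

    public static void main(String[] args) {
        double weak = metaEvaluate(new double[]{0.9, 0.05});  // big steps, tiny budget
        double strong = metaEvaluate(new double[]{0.1, 0.9}); // small steps, big budget
        System.out.println("weak settings fitness:   " + weak);
        System.out.println("strong settings fitness: " + strong);
    }
}
```

In ECJ itself the decode rules come from the meta-EA parameter file and the inner run is a full ECJ run with its own parameter database, but the shape is the same: meta-genome in, parameter strings out, nested run, best-of-run fitness back.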

In case anyone is interested, here's the paper.
	http://cs.gmu.edu/~sean/papers/gecco13metaga.pdf

This was a pretty brutal test of ECJ's modularity and self-containment.  We used 129 processors to do a grand total of about 14 million evolutionary runs involving four different evolutionary algorithm styles.  Took a few weeks.  Big props to Khaled Talukder who was pretty instrumental in the whole thing.

Sean