Solving the Rosenbrock Function Using an Evolutionary Algorithm

I’ve been interested in Evolutionary Algorithms ever since my college days. In pseudo-code:

create a population of solutions
loop many times
  pick two good solutions
  breed them to create a child
  mutate child slightly
  replace a bad soln with child
  generate a new random soln
  replace a bad soln with random soln
end-loop
return best solution found
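The pseudo-code above can be sketched in Python. The population size, mutation scale, and replacement scheme here are my own illustrative choices, not the exact parameters from my demo; the fitness function is the Rosenbrock function discussed in this post.

```python
import random

def rosenbrock(x, y):
    # f(x, y) = 100*(y - x^2)^2 + (1 - x)^2; global minimum f(1, 1) = 0
    return 100.0 * (y - x * x) ** 2 + (1.0 - x) ** 2

def solve(pop_size=20, gens=2000, mut_scale=0.1, lo=-2.0, hi=2.0, seed=0):
    rnd = random.Random(seed)
    # create a population of random (x, y) solutions
    pop = [[rnd.uniform(lo, hi), rnd.uniform(lo, hi)] for _ in range(pop_size)]
    for _ in range(gens):
        # sort by error: lower Rosenbrock value is better
        pop.sort(key=lambda s: rosenbrock(s[0], s[1]))
        # pick two good solutions from the better half
        p1, p2 = rnd.sample(pop[:pop_size // 2], 2)
        # breed: element-wise average crossover
        child = [(a + b) / 2.0 for a, b in zip(p1, p2)]
        # mutate child slightly with Gaussian noise
        child = [v + mut_scale * rnd.gauss(0.0, 1.0) for v in child]
        # replace the worst solution with the child
        pop[-1] = child
        # generate a new random solution and replace the next-worst
        pop[-2] = [rnd.uniform(lo, hi), rnd.uniform(lo, hi)]
    # return best solution found
    pop.sort(key=lambda s: rosenbrock(s[0], s[1]))
    return pop[0]

best = solve()
print(best)
```

Because the worst two solutions are replaced only after sorting, the best solution always survives, so the best error never increases from one generation to the next.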

A few blog posts ago, I experimented with the Adam (adaptive moment estimation) algorithm, using the Rosenbrock function as my test case. The Rosenbrock function looks very simple, but its minimum sits at the bottom of a long, narrow, curved valley, which makes it a difficult problem for most optimization algorithms. Rosenbrock’s function is defined as:

f(x, y) = 100(y – x^2)^2 + (1 – x)^2

The function has a global minimum at x = 1, y = 1, where f(x, y) = 0. Deceptively simple.
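As a sanity check, the definition above can be coded directly (the function name is my own):

```python
def rosenbrock(x, y):
    # f(x, y) = 100*(y - x^2)^2 + (1 - x)^2
    return 100.0 * (y - x * x) ** 2 + (1.0 - x) ** 2

print(rosenbrock(1.0, 1.0))  # prints 0.0 -- the global minimum
print(rosenbrock(0.0, 1.0))  # prints 101.0 -- a small move, a huge change in f
```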



Rosenbrock’s function from the Wikipedia article on the topic



Anyway, one morning before work, I coded up an evolutionary algorithm and targeted Rosenbrock’s function. My results were both good and bad. My demo quickly found a pretty good solution of (1.004683, 1.009691), but I was hoping for something a bit closer to (1.0, 1.0), and my algorithm required more hyperparameter tuning than I had expected.

I suspect that years from now (I’m not sure when) evolutionary algorithms will become more important than they are today. Most deep neural optimization techniques require calculus gradients. Evolutionary algorithms are simple and do not require gradients, but they need far more processing time than gradient-based techniques. If biologically plausible neural systems (such as neuromorphic networks) mature, and general computing power improves, then evolutionary algorithms will become useful, because most biologically plausible systems do not have calculus gradients.

While I was writing this blog post, I came across a research paper describing an optimization algorithm I hadn’t heard of before, called “Adaptive Coordinate Descent”. See http://www.loshchilov.com/publications/GECCO2011_AdaptiveCoordinateDescent.pdf for the paper. I gave the paper a quick scan but found it very difficult to read (possibly because the authors are French and writing in a second language). I’ll give the paper another look when I’m stuck on an airplane or something.



Fractals are essentially evolutionary art. Here are three computer-generated fractals that look biologically plausible.

This entry was posted in Machine Learning.