Difference between revisions of "Optimization"

From Sean_Carver
Revision as of 17:12, 20 April 2009

First, we discuss homework from Lab V.

Now on to today's topic.

What is Optimization

Given a function that depends upon parameters, find the values of the parameters that optimize the function (e.g. if the function is the likelihood, find the parameter values that maximize it).

Minimizing functions is trivially the same thing as maximizing functions. Most optimization routines are written to minimize functions; if you want to maximize the log-likelihood with such a routine, you can just minimize the negative log-likelihood. Putting a minus sign in front of the function turns the surface upside down and costs next to nothing computationally.
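To make this concrete, here is a minimal sketch in Python (an illustrative example, not code from the course): we estimate the mean of a Gaussian with known standard deviation 1 by handing the negative log-likelihood to a standard minimizer.

```python
from scipy.optimize import minimize_scalar

# Hypothetical data, assumed drawn from a Gaussian with sigma = 1.
data = [1.2, 0.8, 1.5, 0.9, 1.1]

def neg_log_likelihood(mu):
    # Up to an additive constant, -log L(mu) = 0.5 * sum((x - mu)^2).
    # Minimizing this is the same as maximizing the log-likelihood.
    return 0.5 * sum((x - mu) ** 2 for x in data)

result = minimize_scalar(neg_log_likelihood)
print(result.x)  # the maximum-likelihood estimate of mu (the sample mean)
```

The minimizer never knows we are "really" maximizing; the sign flip is the entire trick.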

No Free Lunch!

What is the best optimization routine to use? The "no free lunch" theorem tells you that it doesn't exist; or rather, it depends upon your problem. As a result, there is a wide diversity of different algorithms used in practice. Consider the following rather famous newsgroup post that describes many of the algorithms in a funny way. (OK, most of these are inside jokes that only make sense if you already know the algorithms, but the post at least gives you a flavor of the diversity of optimization algorithms.) The post can be found at http://www.sasenterpriseminer.com/documents/Kangaroos%20and%20Training%20Neural%20Networks.txt.