# Talk:Metaheuristic Optimization

<review>message to curators</review>The new version is definitely an improvement over the previous one. I decided to perform a substantial number of minor edits directly in the text because it was much easier that way. Accordingly, the author should carefully revise the text because I might have introduced some errors or modified what he intended to say. Otherwise, I only have one remark:

In the description of the cooling schedule in simulated annealing, it is said that T tends toward 0 as t tends toward t_f for the chosen formula for \beta, but it seems to me that T tends toward T_f instead.
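To make the remark concrete, here is a quick numeric check. This is a sketch under my own assumption that the article uses the common linear schedule T(t) = T_0 - \beta t with \beta = (T_0 - T_f)/t_f; the names T0, Tf, and tf are mine, not the article's.

```python
# Sketch (my reconstruction, not the article's exact notation): with the
# linear cooling schedule T(t) = T0 - beta*t and beta = (T0 - Tf)/tf,
# the temperature reaches Tf at t = tf, not 0.
T0, Tf, tf = 100.0, 1.0, 512.0

beta = (T0 - Tf) / tf  # chosen so that the schedule ends at Tf

def T(t):
    """Linear cooling schedule."""
    return T0 - beta * t

print(T(0.0), T(tf))  # 100.0 1.0 -> T tends to T_f, not to 0
```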

--

Based on the standard review terminology, I would say that this article requires a major revision. The reasons are the following:

- Tabu Search is not mentioned once. This metaheuristic is important not only from a historical point of view, but also because it is widely used. Note, in particular, that the term "metaheuristic" was coined by Fred Glover in his 1986 seminal paper, a fact worth mentioning.

- I understand that the most popular metaheuristics will have entries of their own, but still, some additional effort should be put into the descriptions of Simulated Annealing, Genetic Algorithms, Ant Colony Optimization, Particle Swarm Optimization (and Tabu Search, of course).

- Too much emphasis is put on the Firefly Algorithm and Cuckoo search, which are not widely used. The author should realize that this is not an overview of his own work, but an overview of the whole field of metaheuristics. As it stands, the reader gets a partial and biased account of this field.

- There are many lexical and grammatical errors. This article should be revised by a native English speaker (which should not be too difficult, given that the author is now at Cambridge University, UK). I would only like to stress the multiple occurrences of "L\’evy" in the text.

- Sometimes, it is difficult to know what the author means. For example:

"For stochastic algorithms, in general, we have two types: heuristic and metaheuristic, though their difference is small" ???

"This is in combination with the selection of the best solutions." Here, it is not clear what should be in combination with the selection of the best solutions.

"... but also keeps some changes that are not ideal." ???

"...the piecewise paths formed by positional vectors in a quasi-stochastic manner." ???

## Clarify what problems are best solved via metaheuristic optimization

I think the exposition could be strengthened to make it clear which problems benefit from these techniques. For example, the introduction says "Metaheuristic optimization concerns more generalized, nonlinear optimization problems." That is true, but misleading. The most important distinction in optimization is not linear vs. nonlinear, but convex vs. nonconvex. Almost all convex problems can be cast as conic programs over standard cones, and hence are solvable by polynomial-time algorithms, so they are in theory much faster to solve. In practice, they are also much easier to solve. For a convex problem, you **never** want to use a metaheuristic technique.

On a similar note, I would suggest that 'convex relaxation' might fit into the metaheuristic category. It doesn't work for all problems, but for something like maxcut there are famous results by Goemans and Williamson. For other classes, such as polynomial optimization, there are rigorous results on using convex sum-of-squares relaxations. And even when there are no provable results, convex relaxations can provide good initialization points, as well as lower bounds, for nonconvex problems. Stephen Becker-4 09:39, 6 December 2011 (UTC)
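As an addendum, here is a toy illustration of the bound-plus-initialization point above. It is my own construction, not from the article or the Goemans–Williamson result: relaxing the binary variables of a small 0/1 knapsack to the interval [0, 1] yields a convex (here, linear) problem whose optimum bounds the combinatorial optimum; for a single knapsack constraint, the greedy fractional solution solves the LP relaxation exactly, and rounding it down gives a feasible starting point.

```python
# Illustrative sketch (my construction): a convex relaxation of a tiny
# 0/1 knapsack. Relaxing x in {0,1} to x in [0,1] gives a convex problem
# whose optimum upper-bounds the true maximum.
from itertools import product

values = [6, 10, 12]
weights = [1, 2, 3]
capacity = 5

# LP relaxation: for one knapsack constraint, filling items in order of
# value/weight ratio and splitting the last item is the exact LP optimum.
order = sorted(range(len(values)), key=lambda i: values[i] / weights[i],
               reverse=True)
remaining, lp_bound = capacity, 0.0
for i in order:
    take = min(1.0, remaining / weights[i])  # fraction of item i taken
    lp_bound += take * values[i]
    remaining -= take * weights[i]
    if remaining <= 0:
        break

# Exact 0/1 optimum by brute force (3 items -> 8 candidates).
best = max(
    sum(v * x for v, x in zip(values, xs))
    for xs in product((0, 1), repeat=len(values))
    if sum(w * x for w, x in zip(weights, xs)) <= capacity
)

print(lp_bound, best)  # 24.0 22 -> the relaxation upper-bounds the optimum
```

Rounding the fractional item down to 0 gives the feasible point (1, 1, 0), which a local or metaheuristic search could then improve toward the true optimum of 22.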