This is an excellent summary, written with the author's usual succinctness.
I have just one minor point to raise: for large networks, it makes a big difference whether the barriers confining the system to the neighborhood of a particular fixed point grow with the size of the network. For example, when all T_jk = T_0 > 0, there are just two fixed points (all up and all down), and the barriers grow in proportion to the system size. If the couplings have the opposite sign, by contrast, there can be many fixed points separated by small barriers. With finite noise, this difference is important: we are generally interested in large networks that are stabilized by their size (this is implicit in the notion of "collective computation"). This point seems worth mentioning, if space allows.
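The scaling claimed above can be checked numerically. The sketch below is my own illustration, not taken from the manuscript: it assumes the standard energy function E = -(1/2) Σ_{j≠k} T_jk s_j s_k with uniform couplings normalized as T_0/N (the 1/N factor keeps the energy extensive), and measures the barrier crossed when flipping spins one at a time from the all-up to the all-down fixed point.

```python
import numpy as np

def energy(s, T):
    # Hopfield-style energy E = -(1/2) s^T T s with zero diagonal in T
    return -0.5 * (s @ T @ s)

def barrier_uniform(N, T0=1.0):
    # Uniform ferromagnetic couplings T_jk = T0/N for j != k
    T = (T0 / N) * (np.ones((N, N)) - np.eye(N))
    s = np.ones(N)               # start at the all-up fixed point
    e_min = energy(s, T)
    e_max = e_min
    for j in range(N):           # flip spins one at a time toward all-down
        s[j] = -1.0
        e_max = max(e_max, energy(s, T))
    return e_max - e_min         # barrier height along this path

# The barrier grows in proportion to N (here it equals T0*N/2):
for N in (10, 100, 1000):
    print(N, barrier_uniform(N))
```

With this normalization the barrier between the two fixed points grows linearly in N, so noise-induced escapes are suppressed as the network grows; with mixed-sign couplings the landscape instead fragments into many shallow minima.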
This is a very pleasantly written, succinct review. Just two comments:
1) Would it be possible to include more precise citations in the text, indicating which article in the reference list discusses which issue (binary vs. continuous variables, optimization, etc.)?
2) Also, one might consider including some reference to the related area of automata networks (e.g. Fogelman-Robert-Tchuente, "Automata Networks in Computer Science", Princeton UP 1987, or Goles-Martinez, "Neural and Automata Networks", Kluwer 1990). There, very similar discrete-time dynamical models have been studied, starting from different considerations but arriving at quite analogous techniques and results. (Only with respect to the analysis of the dynamics, of course, not for associative memory or optimization.)