January 18, 2007

Duality between Probability and Optimization

Posted by David Corfield

One of the reasons I have an interest in what we find out about mechanics in different rigs is that many machine learning algorithms are expressible in thermodynamic form, as the tutorial, Energy-Based Models: Structured Learning Beyond Likelihoods, by Yann LeCun made abundantly clear. We even see there the difference between working in the $\mathbb{R}^{+}$ rig, where you keep the whole probabilistic apparatus, and the $\mathbb{R}^{\min}$ rig, where you just seek an optimal solution. The former typically takes greater effort to work with. Only in special situations is it analytically tractable, so one might use Monte Carlo techniques instead. On the other hand, in the probabilistic set-up the regularization term, which stops the inference problem from being ill-posed, generally comes in the package for free, encoded in a prior.
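The passage from one rig to the other can be seen concretely in the zero-temperature limit: the free energy $-T \log \sum_i e^{-E_i/T}$, computed in the $\mathbb{R}^{+}$ rig, converges to $\min_i E_i$, the quantity the $\mathbb{R}^{min}$ rig computes directly. Here is a minimal sketch (the energies are made-up numbers for illustration):

```python
import math

def softmin(energies, T):
    """Free energy -T * log(sum(exp(-E/T))): the R^+ (probabilistic) side.

    As the temperature T tends to 0, this smoothed minimum converges to
    min(energies), i.e. the R^min (optimization) side of the duality.
    """
    return -T * math.log(sum(math.exp(-e / T) for e in energies))

energies = [3.0, 1.0, 4.0, 1.5]  # illustrative energy values
for T in [1.0, 0.1, 0.01]:
    print(f"T = {T}: softmin = {softmin(energies, T):.4f}")
print(f"min = {min(energies)}")
```

Note that the smoothed value always lies at or below the true minimum, since the sum over all configurations can only lower the free energy.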

Outside machine learning, there’s been much effort spent importing constructions into $\mathbb{R}^{\min}$, so that you might hear of the Duality between Probability and Optimization. For a very good overview of the uses of semirings such as $\mathbb{R}^{\min}$, Jonathan Golan’s Some Recent Applications of Semiring Theory is recommended. I hadn’t known of the roles Sammy Eilenberg and John Horton Conway played in their revival around 1970.
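A standard illustration of such a construction imported into $\mathbb{R}^{min}$ is matrix multiplication: replacing sum by min and product by plus turns matrix powers of an edge-weight matrix into all-pairs shortest path lengths. A small sketch, with a made-up three-node graph:

```python
INF = float('inf')

def minplus_matmul(A, B):
    """Matrix product in the (min, +) semiring: min plays the role of
    addition and + the role of multiplication."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

# Edge-weight matrix of an illustrative directed graph
# (INF = no edge, 0 on the diagonal).
W = [[0,   5,   INF],
     [INF, 0,   2],
     [1,   INF, 0]]

# Repeated min-plus squaring converges to all-pairs shortest path lengths.
D = W
for _ in range(2):
    D = minplus_matmul(D, D)
print(D)  # e.g. D[0][2] is the length of the path 0 -> 1 -> 2
```

The same algebraic move, with other semirings in place of (min, +), recovers many of the applications Golan surveys.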

Posted at January 18, 2007 8:21 PM UTC

TrackBack URL for this Entry:   http://golem.ph.utexas.edu/cgi-bin/MT-3.0/dxy-tb.fcgi/1113