Tuesday, May 12, 2009

Joint distributions aren't that bad

I was reading Noam Nisan's exposition of correlated equilibria. It's interesting to note that finding a correlated equilibrium is easy: you need only solve a linear program whose variables are the probabilities making up the underlying joint distribution. This is in contrast to Nash equilibria.
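
To make the "it's just an LP" point concrete, here's a minimal sketch (my own example, not from Nisan's exposition) that computes a welfare-maximizing correlated equilibrium of a small game, assuming scipy is available. The payoff numbers are a common textbook version of Chicken, and all the variable names are mine:

```python
import numpy as np
from scipy.optimize import linprog

# Payoff matrices u1[a][b], u2[a][b] for row action a, column action b.
# Actions: 0 = Dare, 1 = Chicken (textbook payoffs, an assumption here).
u1 = np.array([[0.0, 7.0],
               [2.0, 6.0]])
u2 = np.array([[0.0, 2.0],
               [7.0, 6.0]])

n = 2                          # actions per player
idx = lambda a, b: a * n + b   # flatten the joint distribution p[a, b]

A_ub, b_ub = [], []
# Player 1: obeying a recommendation a must beat deviating to a_dev.
for a in range(n):
    for a_dev in range(n):
        if a_dev == a:
            continue
        row = np.zeros(n * n)
        for b in range(n):
            row[idx(a, b)] = -(u1[a, b] - u1[a_dev, b])  # flip sign for <= 0
        A_ub.append(row)
        b_ub.append(0.0)
# Player 2: the symmetric constraints over column recommendations.
for b in range(n):
    for b_dev in range(n):
        if b_dev == b:
            continue
        row = np.zeros(n * n)
        for a in range(n):
            row[idx(a, b)] = -(u2[a, b] - u2[a, b_dev])
        A_ub.append(row)
        b_ub.append(0.0)

# Probabilities sum to 1; maximize expected social welfare (linear objective).
A_eq, b_eq = [np.ones(n * n)], [1.0]
c = -(u1 + u2).flatten()

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, 1)] * (n * n), method="highs")
print(res.x.reshape(n, n))  # a correlated equilibrium distribution
```

Each inequality says that a player who is told to play a has no incentive to switch to a_dev; since these constraints are linear in the joint probabilities p(a, b), the whole problem is a linear program.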

What struck me is that this is a good example of where a joint distribution isn't that bad. In machine learning especially, trying to "learn" a joint distribution is hard, first because the number of parameters blows up exponentially with the number of variables, and second because of the nasty correlations one has to account for. In fact, the area of variational methods often uses the trick of replacing a joint distribution by the "closest" product distribution and optimizing over that instead.
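
For concreteness, here's a minimal sketch of that trick on a toy example (mine, not from the post): a tiny two-variable joint distribution is approximated by a product distribution q1(x)·q2(y), using the standard mean-field coordinate updates that minimize KL(q ‖ p). The particular distribution p and the iteration count are arbitrary choices:

```python
import numpy as np

# A correlated joint distribution over two binary variables.
p = np.array([[0.30, 0.05],
              [0.05, 0.60]])
logp = np.log(p)

# Initialize the two marginals uniformly.
q1 = np.full(2, 0.5)
q2 = np.full(2, 0.5)

for _ in range(100):
    # Mean-field updates: q1(x) ∝ exp(E_{q2}[log p(x, y)]), and symmetrically.
    q1 = np.exp(logp @ q2)
    q1 /= q1.sum()
    q2 = np.exp(logp.T @ q1)
    q2 /= q2.sum()

q = np.outer(q1, q2)
kl = np.sum(q * (np.log(q) - logp))
print(q, kl)  # the product approximation washes out the correlation in p
```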

Here though, the "replacing by a product distribution" move takes you from a correlated equilibrium problem to a Nash equilibrium problem: the individual probabilities now multiply, yielding a nonlinear problem that is PPAD-complete.
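
To see where the nonlinearity comes from, here's the two-player case in symbols (my notation, not Nisan's). The correlated equilibrium constraint is linear in the joint variables p(a, b); imposing the product form p(a, b) = x(a)·y(b) turns the same constraint into one that is bilinear in the unknowns x and y:

```latex
% Correlated equilibrium: linear in the joint probabilities p(a,b).
\sum_{b} p(a,b)\,\bigl(u_1(a,b) - u_1(a',b)\bigr) \ge 0
  \quad \text{for all actions } a, a' \text{ of player 1.}

% Nash: substitute the product form p(a,b) = x(a)\,y(b); the same
% constraint is now bilinear in the mixed strategies x and y.
\sum_{b} x(a)\,y(b)\,\bigl(u_1(a,b) - u_1(a',b)\bigr) \ge 0
  \quad \text{for all } a, a'.
```

(For any a with x(a) > 0 one can divide through by x(a), recovering the usual statement that every action in the support of x is a best response to y.)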