Optimizing costly functions with simple constraints: a limited-memory projected quasi-Newton algorithm

M. Schmidt, E. van den Berg, M. P. Friedlander, K. Murphy
Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS), 2009

PDF | Slides | Software | AISTATS Best Paper

Abstract

An optimization algorithm for minimizing a smooth function over a convex set is described. Each iteration of the method computes a descent direction by minimizing, over the original constraints, a diagonal plus low-rank quadratic approximation to the function. The quadratic approximation is constructed using a limited-memory quasi-Newton update. The method is suitable for large-scale problems where evaluation of the function is substantially more expensive than projection onto the constraint set. Numerical experiments on one-norm regularized test problems indicate that the proposed method is competitive with state-of-the-art methods such as bound-constrained L-BFGS and orthant-wise descent. We further show that the method generalizes to a wide class of problems, and substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
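To make the iteration concrete, below is a minimal Python/NumPy sketch of the kind of method the abstract describes. It is not the authors' released PQN code: it substitutes a memory-one quasi-Newton pair for the paper's limited-memory update, and it assumes user-supplied callables f (objective), grad (its gradient), and project (Euclidean projection onto the constraint set), all of which are hypothetical names introduced here for illustration.

import numpy as np

def pqn_sketch(f, grad, project, x0, outer=50, inner=25, tol=1e-6):
    # Illustrative sketch, not the PQN reference implementation.
    # Minimizes smooth f over a convex set given by its Euclidean projection,
    # using a memory-one stand-in for the limited-memory quasi-Newton model.
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    s = y = None
    for _ in range(outer):
        # Model Hessian-vector product B @ v from one BFGS pair (s, y);
        # identity fallback before any curvature information exists.
        if s is not None and s @ y > 1e-10:
            sigma = (y @ y) / (s @ y)        # scaled-identity (diagonal) part
            Bs = sigma * s
            def Bv(v):
                return sigma * v - Bs * (s @ v) / (s @ Bs) + y * (y @ v) / (s @ y)
            L = 2.0 * sigma                  # upper bound on the model curvature
        else:
            def Bv(v):
                return v
            L = 1.0
        # Inner loop: projected gradient on the quadratic model
        #   q(p) = g.(p - x) + 0.5 (p - x).B(p - x),
        # which needs only projections, never f or grad -- the key saving
        # when the function is far more expensive than the projection.
        p = x.copy()
        for _ in range(inner):
            p = project(p - (g + Bv(p - x)) / L)
        d = p - x
        # Backtracking (Armijo) line search on f along the feasible direction.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-10:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        x, g = x_new, g_new
        if np.linalg.norm(project(x - g) - x) < tol:  # first-order stationarity
            return x
    return x

With bound constraints, for instance, project can simply be lambda v: np.clip(v, lo, hi). The inner loop then touches only this cheap projection, while the expensive f and grad are evaluated once per outer iteration plus a few line-search trials, which is the trade-off the abstract emphasizes.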

BiBTeX

@InProceedings{SchmidtBergFriedlanderMurphy:2009,
  author =       {M. Schmidt and E. van den Berg and M. P. Friedlander
                  and K. Murphy},
  title =        {Optimizing Costly Functions with Simple Constraints:
                  A Limited-Memory Projected Quasi-Newton Algorithm},
  booktitle =    {Proceedings of the Twelfth International Conference
                  on Artificial Intelligence and Statistics (AISTATS)
                  2009},
  pages =        {456--463},
  year =         2009,
  editor =       {D. van Dyk and M. Welling},
  volume =       5,
  address =      {Clearwater Beach, Florida},
  month =        {April},
}