subgradient methods

Tuesday, January 26, 2016 - 9:00am - 9:50am
James Renegar (Cornell University)
Recently we introduced a framework for applying subgradient methods to solve general convex conic optimization problems. The framework, once seen, is obvious, yet it had not previously appeared in the literature, a blind spot. Quite recently we posted a refinement of the framework for the special case of hyperbolic programming. Hyperbolicity cones have algebraic structure that is ideal for smoothing. Once a hyperbolic program is smoothed, virtually any accelerated method can be applied, which, done with care, yields a first-order algorithm with the best possible iteration bound.
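For readers unfamiliar with the basic ingredient of the talk, the sketch below shows a generic subgradient method with a diminishing step size applied to a simple nonsmooth convex objective (least absolute deviations). It is only an illustration of the textbook method, not the conic framework or the smoothing/acceleration scheme described in the abstract; the objective and step-size rule are chosen purely for the example.

import numpy as np

def subgradient_method(A, b, x0, steps=500):
    """Minimize the nonsmooth convex function f(x) = ||Ax - b||_1
    with a plain subgradient method and step size ~ 1/sqrt(k).
    Illustrative sketch only, not the framework from the talk."""
    x = x0.copy()
    best_x, best_f = x.copy(), np.sum(np.abs(A @ x - b))
    for k in range(1, steps + 1):
        r = A @ x - b
        g = A.T @ np.sign(r)            # a subgradient of f at x
        x = x - (1.0 / np.sqrt(k)) * g  # diminishing step size
        f = np.sum(np.abs(A @ x - b))
        if f < best_f:                  # subgradient steps need not decrease f,
            best_x, best_f = x.copy(), f  # so keep the best iterate seen so far
    return best_x, best_f

# Small usage example on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x_hat, f_hat = subgradient_method(A, b, np.zeros(5))
print(f_hat)

The tracking of the best iterate reflects a standard feature of subgradient methods: unlike gradient descent on smooth functions, individual steps are not guaranteed to decrease the objective, so convergence guarantees are stated for the best (or averaged) iterate.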