A simple 2D graphing tool for the convergence of fixed-point iterations and plug-and-play methods

Monday, October 14, 2019 - 4:00pm - 4:45pm
Keller 3-180
Wotao Yin (University of California, Los Angeles)
Have you ever fallen asleep while reading a convergence proof? Most convergence proofs are written in algebraic equalities and inequalities. They are rigorous but are often not an intuitive way to illustrate the core proof ideas.

The good news is that, for the nonexpansive operators underlying popular optimization methods such as forward-backward splitting, alternating projection, Douglas-Rachford splitting, ADMM, and so on, there exists a simple 2D graphing tool, the Scaled Relative Graph, that not only captures their core ideas but also serves as a rigorous convergence proof.
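To make the fixed-point-iteration viewpoint concrete, here is a minimal sketch (not from the talk; the toy problem and all names are illustrative) of the forward-backward method applied to a lasso-type objective, where each step applies a nonexpansive operator and the iteration converges to its fixed point:

```python
import numpy as np

# Forward-backward splitting as a fixed-point iteration:
#   x_{k+1} = prox_{t*g}(x_k - t * grad f(x_k))
# Illustrative toy problem: f(x) = 0.5*||A x - b||^2, g(x) = lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
t = 1.0 / np.linalg.norm(A, 2) ** 2  # step size <= 1/L keeps the forward step nonexpansive

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(10)
for _ in range(500):
    x_new = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)
    if np.linalg.norm(x_new - x) < 1e-10:  # fixed-point residual is small: stop
        x = x_new
        break
    x = x_new
```

A fixed point of this iteration is exactly a minimizer of f + g; the Scaled Relative Graph gives a geometric way to see why iterating such an operator contracts toward that point.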

In this talk, we present the Scaled Relative Graph and apply it to establish the convergence of two existing plug-and-play (PnP) methods. PnP integrates pre-trained deep networks (or other nonlinear operators) into the forward-backward and ADMM frameworks. It is useful when we do not have enough data for end-to-end training: for example, we can train a denoiser on natural images and then "plug" it into ADMM for medical image reconstruction.
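The PnP-ADMM idea can be sketched in a few lines. The following is an illustrative toy (not the talk's method): the data-fidelity term is a simple quadratic with a closed-form x-update, and a local averaging filter stands in for the pre-trained denoiser that would normally occupy the z-update:

```python
import numpy as np

# Plug-and-play ADMM sketch (all names and the toy problem are illustrative):
# the proximal step of the regularizer is replaced by a denoiser D.
# Data-fidelity term: f(x) = 0.5*||x - b||^2, so the x-update is closed-form.
rng = np.random.default_rng(1)
b = np.sin(np.linspace(0, 4 * np.pi, 100)) + 0.3 * rng.standard_normal(100)

def denoise(v):
    """Stand-in denoiser: a local averaging filter.
    In real PnP, a pre-trained deep denoiser is plugged in here."""
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(v, kernel, mode="same")

rho = 1.0
x = np.zeros_like(b)
z = np.zeros_like(b)
u = np.zeros_like(b)
for _ in range(300):
    x = (b + rho * (z - u)) / (1.0 + rho)  # prox of f (closed form here)
    z = denoise(x + u)                     # plugged-in denoiser replaces prox of g
    u = u + x - z                          # dual (running residual) update
```

Whether this iteration converges depends on properties of the denoiser, such as nonexpansiveness; characterizing when that holds is exactly where a tool like the Scaled Relative Graph helps.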

This talk has materials from joint work with Ernest Ryu and Jialin Liu from UCLA and Xiaohan Chen, Sichen Wang, and Zhangyang Wang from Texas A&M.
MSC Codes: 47H05, 47H09, 51M04, 90C25