TY - GEN

T1 - Robustifying FISTA via the infinity norm of its smooth component's gradient

AU - Rodriguez, Paul

N1 - Publisher Copyright:
© 2020 IEEE.

PY - 2020/11/1

Y1 - 2020/11/1

N2 - FISTA is a well-known and fast procedure for solving optimization problems composed of the sum of two convex functions, i.e. F = f + g, such that \nabla f is L-Lipschitz continuous and g is possibly nonsmooth. FISTA's well-studied theoretical RoC (rate of convergence) is \mathcal{O}\left( k^{-2} \right); in practice, however, it depends on both the extragradient rule and the step-size (SS) that estimates L. An ill-chosen SS (e.g. a large pre-defined constant) can, at worst, force the objective to diverge; furthermore, some adaptive SS methods (e.g. line search, Cauchy, etc.) can slow down convergence or force the objective to exhibit oscillatory behavior. In this work we present a simple add-on feature to robustify FISTA against an ill-chosen SS when F is the l1 regularized problem. It is based on modifying some entries of \nabla f_k so that \left\{ \left\| \nabla f_k \right\|_\infty \right\} becomes a non-increasing sequence. Furthermore, tracking and limiting \left\{ \left\| \nabla f_k \right\|_\infty \right\} can be used (i) as an early warning method to avoid divergence and (ii) to allow larger or even consistently increasing SS sequences. Our computational results particularly target Convolutional Sparse Representations (CSR), where our method indeed boosts FISTA's practical performance.

AB - FISTA is a well-known and fast procedure for solving optimization problems composed of the sum of two convex functions, i.e. F = f + g, such that \nabla f is L-Lipschitz continuous and g is possibly nonsmooth. FISTA's well-studied theoretical RoC (rate of convergence) is \mathcal{O}\left( k^{-2} \right); in practice, however, it depends on both the extragradient rule and the step-size (SS) that estimates L. An ill-chosen SS (e.g. a large pre-defined constant) can, at worst, force the objective to diverge; furthermore, some adaptive SS methods (e.g. line search, Cauchy, etc.) can slow down convergence or force the objective to exhibit oscillatory behavior. In this work we present a simple add-on feature to robustify FISTA against an ill-chosen SS when F is the l1 regularized problem. It is based on modifying some entries of \nabla f_k so that \left\{ \left\| \nabla f_k \right\|_\infty \right\} becomes a non-increasing sequence. Furthermore, tracking and limiting \left\{ \left\| \nabla f_k \right\|_\infty \right\} can be used (i) as an early warning method to avoid divergence and (ii) to allow larger or even consistently increasing SS sequences. Our computational results particularly target Convolutional Sparse Representations (CSR), where our method indeed boosts FISTA's practical performance.

UR - http://www.scopus.com/inward/record.url?scp=85107801409&partnerID=8YFLogxK

U2 - 10.1109/IEEECONF51394.2020.9443517

DO - 10.1109/IEEECONF51394.2020.9443517

M3 - Conference contribution

AN - SCOPUS:85107801409

T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers

SP - 341

EP - 342

BT - Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020

A2 - Matthews, Michael B.

PB - IEEE Computer Society

T2 - 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020

Y2 - 1 November 2020 through 5 November 2020

ER -