In this article we propose an algorithm, PARNES, for the basis pursuit
denoising (BPDN) problem, which approximately finds a minimum one-norm
solution to an underdetermined least squares problem. PARNES (1) combines
what we believe are the best features of the currently available methods
SPGL1 and NESTA, and (2)
incorporates a new improvement that exhibits linear convergence under the
assumption of the restricted isometry property (RIP). As with SPGL1, our
approach 'probes the Pareto frontier' and determines a solution to the BPDN
problem by exploiting its relation to the LASSO problem, as given by the
Pareto curve. As with NESTA, we rely on the accelerated proximal gradient method
proposed by Yu. Nesterov, which takes a remarkable O((L/e)^(1/2)) iterations
to come within e > 0 of the optimal value, where L is the Lipschitz constant
of the gradient of the objective function. Furthermore, we introduce an 'outer
loop' that regularly updates the prox center. Nesterov's accelerated proximal
gradient method then becomes the 'inner loop'. The restricted isometry property
together with the Lipschitz differentiability of our objective function permits
us to derive a condition for switching between the inner and outer loop in a
provably optimal manner. A by-product of our approach is a new algorithm for
the LASSO problem that also exhibits linear convergence under RIP.
Wednesday, November 4, 2009
PARNES: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals. (arXiv:0911.0492v1 [math.OC])