We study a class of non-convex and non-smooth problems with \textit{rank}
regularization, which promotes sparsity in the optimal solution. We propose
solving the problem with the proximal gradient descent method, accelerated by a
novel support-set projection operation on the singular values of the
intermediate updates. We show that our algorithms achieve a convergence rate of
$O(\frac{1}{t^2})$, which matches Nesterov's optimal convergence rate for
first-order methods on smooth convex problems. Strict sparsity is guaranteed,
and the support set of singular values shrinks monotonically across updates,
which, to the best of our knowledge, is novel among momentum-based algorithms.
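The abstract does not spell out the algorithm; as a rough illustrative sketch (the function names, the hard-thresholding proximal rule, and the exact support-update logic are assumptions, not the paper's method), a proximal gradient iteration with a monotonically shrinking singular-value support might look like:

```python
import numpy as np

def prox_grad_rank(grad_f, X0, step, tau, iters=100):
    """Hypothetical proximal gradient loop for a rank-regularized objective.

    Each iteration takes a gradient step, then projects the singular values
    of the intermediate update onto a support set. The support mask only
    shrinks: once a singular-value index is dropped, it is never re-admitted,
    mirroring the monotone shrinkage described in the abstract.
    """
    X = X0
    support = np.ones(min(X0.shape), dtype=bool)
    for _ in range(iters):
        Y = X - step * grad_f(X)                 # gradient step on smooth part
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        support = support & (s > tau)            # shrink support, drop small s
        X = (U[:, support] * s[support]) @ Vt[support]  # rebuild low-rank X
    return X
```

For example, with the smooth part $f(X) = \frac{1}{2}\|X - A\|_F^2$ (so the gradient is $X - A$), the iteration converges to a hard-thresholded, lower-rank approximation of $A$.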