Theory and Results on Restarting Schemes for Accelerated First Order Methods

dc.contributor.author  Pavlovic, Viktor
dc.date.accessioned  2024-09-10T14:28:29Z
dc.date.available  2024-09-10T14:28:29Z
dc.date.issued  2024-09-10
dc.date.submitted  2024-08-26
dc.description.abstract  Composite convex optimization problems are abundant in industry, and first order methods for solving them are growing in popularity as problem sizes reach billions of variables. Because the objective function may be non-smooth, proximal gradient methods are among the main tools for these problems. These methods benefit from acceleration, which uses the memory of past iterates to add momentum to the algorithm, yielding an O(1/k^2) convergence rate in function value, where k is the iteration number. Restarting has been observed to speed such methods up further. O'Donoghue and Candes introduced adaptive restart strategies for accelerated first order methods that rely on easy-to-compute conditions and show a large performance boost in practice; the restart works by resetting the momentum gained from acceleration. In general, however, their strategies are heuristics with no proof of convergence. In this thesis we show that restarting with the O'Donoghue and Candes condition improves the standard convergence rate in special cases. For one-dimensional functions we prove that their gradient-based restart strategy improves the O(1/k^2) bound. We also study the restarting scheme applied to the method of alternating projections (MAP) for two closed, convex, and nonempty sets. It is shown in Chapter 6 that MAP falls into the convex composite paradigm, so acceleration can be applied; we study the case of MAP applied to two hyperplanes in arbitrary dimension. Furthermore, we make observations on why restarts help, what makes a good restart condition, and what is needed to make progress in the general case. (A minimal code sketch of the gradient-based restart follows the metadata record below.)
dc.identifier.uri  https://hdl.handle.net/10012/20979
dc.language.iso  en
dc.pending  false
dc.publisher  University of Waterloo
dc.subject  FISTA
dc.subject  convex optimization
dc.subject  first order methods
dc.subject  accelerated gradient descent
dc.title  Theory and Results on Restarting Schemes for Accelerated First Order Methods
dc.type  Master Thesis
uws-etd.degree  Master of Mathematics
uws-etd.degree.department  Combinatorics and Optimization
uws-etd.degree.discipline  Combinatorics and Optimization
uws-etd.degree.grantor  University of Waterloo
uws-etd.embargo.terms  0
uws.contributor.advisor  Vavasis, Stephen
uws.contributor.advisor  Moursi, Walaa
uws.contributor.affiliation1  Faculty of Mathematics
uws.peerReviewStatus  Unreviewed
uws.published.city  Waterloo
uws.published.country  Canada
uws.published.province  Ontario
uws.scholarLevel  Graduate
uws.typeOfResource  Text
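
As a companion to the abstract, the following is a minimal sketch of an accelerated proximal gradient method (FISTA) with the gradient-based adaptive restart of O'Donoghue and Candes, written in Python with NumPy. It is an illustration under stated assumptions, not the thesis's own code: the lasso test instance (matrix A, vector b, penalty lam) and the helper names fista_restart, grad_f, and prox_g are hypothetical choices made here for concreteness.

import numpy as np

def fista_restart(grad_f, prox_g, L, x0, iters=500):
    # Accelerated proximal gradient (FISTA) for min f(x) + g(x), with
    # the gradient-based adaptive restart of O'Donoghue and Candes:
    # momentum is reset whenever the momentum direction makes an acute
    # angle with the (generalized) gradient, i.e. the iterates appear
    # to be heading "uphill".
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        # Proximal gradient step from the extrapolated point y.
        x = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Restart test: (y - x) is proportional to the generalized
        # gradient at y; restart when it correlates positively with
        # the last momentum step (x - x_prev).
        if np.dot(y - x, x - x_prev) > 0:
            t = 1.0      # reset momentum (hypothetical restart action)
            y = x.copy()
        else:
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x + ((t - 1.0) / t_next) * (x - x_prev)
            t = t_next
        x_prev = x
    return x_prev

# Hypothetical usage: a small lasso problem,
#   f(x) = 0.5*||A x - b||^2 (smooth),  g(x) = lam*||x||_1 (proximable).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500))
b = rng.standard_normal(200)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, step: np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)
x_hat = fista_restart(grad_f, prox_g, L, np.zeros(500))

Resetting t to 1 zeroes the extrapolation coefficient (t - 1)/t_next, so the iteration after a restart is an ordinary proximal gradient step; this matches the abstract's description of the restart as resetting the momentum gained from acceleration.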

Files

Original bundle

Name: Pavlovic_Viktor.pdf
Size: 563.8 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 6.4 KB
Description: Item-specific license agreed upon to submission