Applications of Stochastic Gradient Descent to Nonnegative Matrix Factorization

dc.contributor.author: Slavin, Matthew
dc.date.accessioned: 2019-07-15T13:47:34Z
dc.date.available: 2019-07-15T13:47:34Z
dc.date.issued: 2019-07-15
dc.date.submitted: 2019-07-08
dc.description.abstract: We consider the application of stochastic gradient descent (SGD) to the nonnegative matrix factorization (NMF) problem and the unconstrained low-rank matrix factorization problem. While the literature on the SGD algorithm is rich, its application to matrix factorization problems remains largely unexplored. We develop a series of results for the unconstrained problem, beginning with an analysis of standard gradient descent when a zero-loss solution is known to exist, and culminating with results for SGD in the general case, where no zero-loss solution is assumed. We show that, with initialization close to a minimizer, linear-rate convergence guarantees exist. We explore these results further with numerical experiments, and examine how the matrix factorization solutions found by SGD can be used as machine learning classifiers in two specific applications. In the first application, handwritten digit recognition, we show that our approach produces classification performance competitive with existing matrix factorization algorithms. In the second application, document topic classification, we examine how well SGD can recover an unknown words-to-topics matrix when the topics-to-documents matrix is generated using the Latent Dirichlet Allocation model. This approach allows us to simulate two regimes for SGD: a fixed-sample regime, in which the model is trained by iterating over a large fixed data set, and a generated-sample regime, in which a new data point is generated at each training iteration. In both regimes, we show that SGD can be an effective tool for recovering the hidden words-to-topics matrix. We conclude with some suggestions for further expansion of this work.
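To illustrate the setting the abstract describes, the following is a minimal sketch (not the thesis's exact algorithm, step size, or initialization scheme) of SGD applied to the unconstrained low-rank factorization loss f(U, V) = Σᵢⱼ (Mᵢⱼ − Uᵢ·Vⱼ)², sampling one entry of M per iteration. The rank-3 target and the "initialization close to a minimizer" are assumptions chosen for the example, mirroring the zero-loss, near-minimizer case analyzed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 30, 20, 3

# Build a rank-r target matrix M = A B^T, so a zero-loss solution exists.
A = rng.standard_normal((m, r))
B = rng.standard_normal((n, r))
M = A @ B.T

# Initialize near the known minimizer (A, B) with a small perturbation,
# matching the "initialization close to a minimizer" regime.
U = A + 0.05 * rng.standard_normal((m, r))
V = B + 0.05 * rng.standard_normal((n, r))

def full_loss(U, V):
    """Full squared Frobenius loss ||M - U V^T||_F^2 (used for monitoring)."""
    return np.sum((M - U @ V.T) ** 2)

initial_loss = full_loss(U, V)
eta = 0.01  # fixed step size, chosen for the example
for _ in range(50_000):
    # Sample one entry (i, j) of M uniformly at random.
    i, j = rng.integers(m), rng.integers(n)
    e = U[i] @ V[j] - M[i, j]          # residual at the sampled entry
    gU, gV = 2 * e * V[j], 2 * e * U[i]  # gradient of (U_i . V_j - M_ij)^2
    U[i] -= eta * gU
    V[j] -= eta * gV

final_loss = full_loss(U, V)
print(initial_loss, final_loss)  # the loss decreases substantially
```

Because a zero-loss solution exists, the per-entry gradient noise shrinks as the iterates approach the minimizer, which is what makes a fixed step size viable in this sketch; in the general case treated in the thesis, no such solution is assumed.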
dc.identifier.uri: http://hdl.handle.net/10012/14798
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: Optimization
dc.subject: Data Science
dc.subject: Machine Learning
dc.title: Applications of Stochastic Gradient Descent to Nonnegative Matrix Factorization
dc.type: Master Thesis
uws-etd.degree: Master of Mathematics
uws-etd.degree.department: Combinatorics and Optimization
uws-etd.degree.discipline: Combinatorics and Optimization
uws-etd.degree.grantor: University of Waterloo
uws.contributor.advisor: Vavasis, Stephen
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle (showing 1 of 1)
- Name: Slavin_Matthew.pdf
- Size: 2.24 MB
- Format: Adobe Portable Document Format

License bundle (showing 1 of 1)
- Name: license.txt
- Size: 6.08 KB
- Format: Item-specific license agreed upon to submission