
dc.contributor.author: Duan, Haonan
dc.date.accessioned: 2021-07-12 14:01:40 (GMT)
dc.date.available: 2021-07-12 14:01:40 (GMT)
dc.date.issued: 2021-07-12
dc.date.submitted: 2021-07-06
dc.identifier.uri: http://hdl.handle.net/10012/17140
dc.description.abstract: With recent advances in approximate inference, Bayesian methods have proven successful on larger datasets and more complex models. The central problem in Bayesian inference is how to approximate intractable posteriors accurately and efficiently. Variational inference addresses this problem by projecting the posterior onto a simpler distribution space. The projection step in variational inference is usually done by minimizing the Kullback–Leibler divergence, but alternative methods can sometimes yield faster and more accurate solutions. Moments are statistics that describe the shape of a probability distribution, and one can project a distribution by matching a set of its moments. The idea of moment matching dates back to the method of moments (MM), a simple approach that estimates unknown parameters by equating a distribution's theoretical moments with their empirical estimates. While MM has been studied primarily in frequentist statistics, it lends itself naturally to approximate Bayesian inference. This thesis aims to better understand how to apply MM to general-purpose Bayesian inference problems and what advantages MM methods offer in Bayesian inference. We begin with one of the simplest models in machine learning and gradually extend to more complex and practical settings; the scope of our work spans theory, methodology, and applications. We first study a specific algorithm that applies MM to mixture posteriors, Bayesian Moment Matching (BMM). We prove the consistency of BMM in a naive Bayes model and then propose an initializer for Boolean SAT solvers based on its extension to Bayesian networks. BMM is quite restrictive and can only be used with conjugate priors. We therefore propose a new algorithm, Multiple Moment Matching Inference (MMMI), a general-purpose approximate Bayesian inference algorithm based on the idea of MM, and demonstrate its competitive predictive performance on real-world datasets.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: Bayesian learning
dc.subject: approximate inference
dc.subject: Bayesian neural networks
dc.subject: Boolean satisfiability problem
dc.subject: method of moments
dc.title: Method of Moments in Approximate Bayesian Inference: From Theory to Practice
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Mathematics
uws-etd.embargo.terms: 0
uws.contributor.advisor: Poupart, Pascal
uws.contributor.affiliation1: Faculty of Mathematics
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
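The method-of-moments idea summarized in the abstract — estimate unknown parameters by equating theoretical moments with empirical ones — can be illustrated with a minimal sketch. This Beta-distribution example is ours, not taken from the thesis; `beta_mm` is a hypothetical helper name.

```python
import random

def beta_mm(samples):
    """Moment-matching estimates for Beta(alpha, beta).

    The Beta distribution's theoretical moments are
        mean = a / (a + b)
        var  = a * b / ((a + b)**2 * (a + b + 1)),
    and setting them equal to the sample mean and variance
    gives closed-form estimates: with c = mean*(1-mean)/var - 1
    (which equals a + b), we get a = mean*c and b = (1-mean)*c.
    """
    n = len(samples)
    m = sum(samples) / n
    v = sum((x - m) ** 2 for x in samples) / n
    c = m * (1 - m) / v - 1
    return m * c, (1 - m) * c

random.seed(0)
# Draw samples from a known Beta(2, 5) and recover its parameters.
data = [random.betavariate(2.0, 5.0) for _ in range(100_000)]
a_hat, b_hat = beta_mm(data)
print(a_hat, b_hat)  # estimates land near the true (2.0, 5.0)
```

Unlike maximum likelihood, which for the Beta distribution requires iterative optimization, matching the first two moments yields the estimates in closed form — the kind of computational shortcut that motivates moment matching as a projection step in approximate Bayesian inference.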



