Inherent Limitations of Dimensions for Characterizing Learnability

dc.contributor.author: Lechner, Tosca Kerstin
dc.date.accessioned: 2024-09-24T18:44:46Z
dc.date.available: 2024-09-24T18:44:46Z
dc.date.issued: 2024-09-24
dc.date.submitted: 2024-09-09
dc.description.abstract: The fundamental theorem of statistical learning establishes that, for binary classification, various notions of both agnostic and realizable Probably Approximately Correct (PAC) learnability are equivalent to the finiteness of the VC dimension. Motivated by these foundational results, this work explores whether similar characterizations of learnability and sample complexity can be defined through dimensions in other learning models. We introduce the concept of a scale-invariant dimension and demonstrate that for a range of learning tasks, including distribution learning, any such dimension fails to capture learnability. Additionally, we define Computable PAC (CPAC) learning and show that not all classes are properly computably PAC learnable, highlighting a significant limitation of classical PAC frameworks. Our analysis further reveals that, unlike in binary classification, realizable learning does not always imply agnostic learning in settings such as distribution learning and CPAC learning. Finally, we address learning under conditions of data corruption, whether from adversaries or self-manipulating agents, and show that the extent of prior knowledge about manipulation capabilities can significantly affect learnability. We explore various ways of overcoming uncertainty about manipulation capabilities, either by learning the manipulation capabilities from distribution shifts or from additional oracle access, or by allowing abstentions. We also show that this uncertainty can sometimes be overcome without additional oracle access in the realizable case, while the agnostic case requires such resources. These findings further underscore that, in order to characterize agnostic learnability, it is not always sufficient to understand the realizable case.
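As a brief illustration of the shattering notion underlying the VC dimension discussed in the abstract, the following is a minimal sketch (not code from the thesis; the threshold hypothesis class and the grid of thresholds are assumptions chosen for the example). A class shatters a point set if every binary labeling of the points is realized by some hypothesis; the VC dimension is the size of the largest shattered set.

```python
def shatters(points, hypotheses):
    """Return True if `hypotheses` realizes every binary labeling of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

# Example class: threshold classifiers h_t(x) = 1 if x >= t, else 0,
# sampled on a small grid of thresholds t (an assumption for illustration).
thresholds = [lambda x, t=t: int(x >= t) for t in range(-5, 6)]

print(shatters([0.5], thresholds))       # True: a single point is shattered
print(shatters([0.5, 2.5], thresholds))  # False: the labeling (1, 0) is unrealizable
```

Since some one-point set is shattered but no two-point set is, this class has VC dimension 1, and by the fundamental theorem it is PAC learnable.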
dc.identifier.uri: https://hdl.handle.net/10012/21092
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.title: Inherent Limitations of Dimensions for Characterizing Learnability
dc.type: Doctoral Thesis
uws-etd.degree: Doctor of Philosophy
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 0
uws.contributor.advisor: Ben-David, Shai
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle

Name: Lechner_Tosca.pdf
Size: 1.21 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 6.4 KB
Format: Item-specific license agreed upon at submission