REACT: REcourse Analysis with Counterfactuals and Explanation Tables

dc.contributor.author: Avksientieva, Anastasiia
dc.date.accessioned: 2025-05-21T15:21:20Z
dc.date.available: 2025-05-21T15:21:20Z
dc.date.issued: 2025-05-21
dc.date.submitted: 2025-05-12
dc.description.abstract: Machine learning models often exhibit not only explicit bias (unequal performance metrics across subgroups) but also implicit bias, where altering a model's prediction is disproportionately difficult for some subgroups. In this work, we investigate two complementary approaches to analyzing how a model's decision can be overturned to achieve a desired label: modifying test input features and unlearning a set of training samples. The novelty of our solution lies in combining these two methods with data summarization via informative rule mining, which highlights biased subgroups. We demonstrate the value of REACT by allowing users to detect a model's implicit bias and compare the biases of different model versions. The resulting framework is flexible, supporting practical constraints on feature-level interventions, for example by limiting changes to modifiable attributes.
dc.identifier.uri: https://hdl.handle.net/10012/21759
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.title: REACT: REcourse Analysis with Counterfactuals and Explanation Tables
dc.type: Master Thesis
uws-etd.degree: Master of Mathematics
uws-etd.degree.department: Data Science
uws-etd.degree.discipline: Data Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 0
uws.contributor.advisor: Golab, Lukasz
uws.contributor.affiliation1: Faculty of Mathematics
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text
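
Illustrative sketch (not part of the thesis record): the abstract above describes recourse via constrained modification of test input features. The code below is a minimal, hypothetical example of such a search, not the REACT implementation. It assumes a scikit-learn classifier trained on synthetic data, restricts changes to a user-specified set of modifiable features, and uses the number of greedy steps as a crude proxy for how difficult recourse is for a given input.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: 4 features, binary label driven by features 0 and 1.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
model = LogisticRegression().fit(X, y)

def find_recourse(x, model, desired, modifiable, step=0.1, max_steps=200):
    """Greedily nudge one modifiable feature at a time toward the desired label.
    Returns (modified point, number of steps) or None if no recourse is found."""
    x = x.copy().astype(float)
    for n_steps in range(max_steps):
        if model.predict(x.reshape(1, -1))[0] == desired:
            return x, n_steps
        current_p = model.predict_proba(x.reshape(1, -1))[0, desired]
        best = None
        for j in modifiable:
            for delta in (step, -step):
                x_try = x.copy()
                x_try[j] += delta
                p = model.predict_proba(x_try.reshape(1, -1))[0, desired]
                if p > current_p:
                    current_p, best = p, (j, delta)
        if best is None:
            return None  # stuck: no single-feature move improves the desired-class probability
        x[best[0]] += best[1]
    return None

# A point currently predicted as class 0; only features 0 and 2 may be modified.
x0 = np.array([-1.0, -0.5, 0.3, 0.2])
print(find_recourse(x0, model, desired=1, modifiable=[0, 2]))

Comparing the step counts returned for inputs from different subgroups gives a simple, toy-level view of the implicit bias the abstract refers to; the thesis additionally considers unlearning training samples and summarizing biased subgroups with explanation tables, which this sketch does not cover.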

Files

Original bundle

Name: Avksientieva_Anastasiia.pdf
Size: 2.37 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 6.4 KB
Format: Item-specific license agreed upon to submission