Inferential Role Semantics for Natural Language
dc.contributor.author | Blouw, Peter | |
dc.date.accessioned | 2017-08-22T15:56:05Z | |
dc.date.available | 2017-08-22T15:56:05Z | |
dc.date.issued | 2017-08-22 | |
dc.date.submitted | 2017-08-03 | |
dc.description.abstract | The most general goal of semantic theory is to explain facts about language use. In keeping with this goal, I introduce a framework for thinking about linguistic expressions in terms of (a) the inferences they license, (b) the behavioral predictions that their uses thereby sustain, and (c) the affordances that they provide to language users in virtue of these inferential and predictive involvements. Within this framework, linguistic expressions acquire meanings by regulating social practices that involve “intentional interpretation,” wherein people explain and predict one another’s behavior through linguistically specified mental state attributions. Developing a theory of meaning therefore requires formalizing the inferential roles that determine how linguistic expressions license predictions in the context of intentional interpretation. Accordingly, the view I develop is an inferential role semantics for natural language. To describe this semantics, I take advantage of recently developed techniques in the field of natural language processing. I introduce a model that assigns inferential roles to arbitrary linguistic expressions by learning from examples of how sentences are distributed as premises and conclusions in a space of possible inferences. I then empirically evaluate the model’s ability to generate accurate entailments for novel sentences not used as training examples. I argue that this model takes a small but important step towards codifying the meanings of the expressions it manipulates. Next, I examine the theoretical implications of this work with respect to debates about the compositionality of language, the relationship between language and cognition, and the relationship between language and the world. With respect to compositionality, I argue that the debate is really about generalization in language use, and that the required sort of generalization can be achieved by “interpolating” between familiar examples of correct inferential transitions. With respect to the relationship between thought and language, I argue that it is a mistake to try to derive a theory of natural language semantics from a prior theory of mental representation because theories of mental representation invoke the sort of intentional interpretation at play in language use from the get-go. With respect to the relationship between language and the world, I argue that questions about truth conditions and reference relations are best thought of in terms of questions about the norms governing language use. These norms, in turn, are best characterized in primarily inferential terms. I conclude with an all-things-considered evaluation of my theory that demonstrates how it overcomes a number of challenges associated with semantic theories that take inference, rather than reference, as their starting point. | en |
dc.identifier.uri | http://hdl.handle.net/10012/12170 | |
dc.language.iso | en | en |
dc.pending | false | |
dc.publisher | University of Waterloo | en |
dc.subject | semantics | en |
dc.subject | neural networks | en |
dc.subject | compositionality | en |
dc.subject | mental representation | en |
dc.subject | meaning | en |
dc.subject | intentional interpretation | en |
dc.title | Inferential Role Semantics for Natural Language | en |
dc.type | Doctoral Thesis | en |
uws-etd.degree | Doctor of Philosophy | en |
uws-etd.degree.department | Philosophy | en |
uws-etd.degree.discipline | Philosophy | en |
uws-etd.degree.grantor | University of Waterloo | en |
uws.contributor.advisor | Eliasmith, Chris | |
uws.contributor.affiliation1 | Faculty of Arts | en |
uws.peerReviewStatus | Unreviewed | en |
uws.published.city | Waterloo | en |
uws.published.country | Canada | en |
uws.published.province | Ontario | en |
uws.scholarLevel | Graduate | en |
uws.typeOfResource | Text | en |