dc.contributor.author: Hudson, Anna
dc.date.accessioned: 2018-08-31 16:33:09 (GMT)
dc.date.available: 2018-08-31 16:33:09 (GMT)
dc.date.issued: 2018-08-31
dc.date.submitted: 2018-08-22
dc.identifier.uri: http://hdl.handle.net/10012/13689
dc.description.abstract: Facial expressions of emotion are a critical source of social information within the environment, but their interpretation is modulated by the situational context in which they are presented. Most empirical work examining the effect of context on face expression processing has utilized visual cues only, with few studies examining cross-modal effects, despite auditory information being a second critical source of social information within the environment. The present study investigates the effect of positive and negative situational auditory (verbal) information on the identification of happy and angry face expressions. Both behavioural measures and Event-Related Potentials (ERPs) derived from EEG recordings were examined. Research has previously demonstrated ERP components elicited by the presentation of faces that are modulated by early visual attention (the P1), changes in facial expression (the N170), emotional valence (the Early Posterior Negativity (EPN)), and the integration of these facets with contextual cues (the Late Positive Potential (LPP)). In the present study, congruent pairings of positive sentences with happy face expressions received a cognitive gain such that reaction times were faster than in all other conditions. Additionally, accuracy for congruent trials was significantly higher than for incongruent trials for both happy and angry faces. Happy expressions elicited a marginally enhanced P1 and a larger N170 amplitude relative to angry faces. The EPN was more negative for angry than for happy faces, an effect that continued into the LPP as a corresponding positive enhancement. There was no interaction between sentence valence and face expression for any of these indices, potentially reflecting distinct neural networks for processing auditory information, or indicating that the auditory information had been processed much earlier, with no modulation of the visual indices measured in this study.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: ERP
dc.subject: Auditory context
dc.subject: Face Expression
dc.subject: Vision
dc.subject: Social Cognition
dc.title: Effects of auditory context on face expression processing: An ERP Investigation
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: Psychology
uws-etd.degree.discipline: Psychology
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Arts
uws.contributor.advisor: Itier, Roxane
uws.contributor.advisor: Henderson, Heather
uws.contributor.affiliation1: Faculty of Arts
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate