Health (Faculty of)
Permanent URI for this community: https://uwspace.uwaterloo.ca/handle/10012/9860
Welcome to the Faculty of Health community.
The Faculty was known as the Faculty of Applied Health Sciences prior to its name change in January 2021. This community and its collections are organized using the University of Waterloo's Faculties and Academics structure. In this structure:
- Communities are Faculties or Affiliated Institutions
- Collections are Departments or Research Centres
Research outputs are organized by type (e.g., Master's Thesis, Article, Conference Paper).
New collections following this structure will be created upon request.
Browsing Health (Faculty of) by Author "Barnett-Cowan, Michael"
Now showing 1 - 10 of 10
Item: Characterizing the dynamics of vestibular reflex gain modulation using balance-relevant sensory conflict
(University of Waterloo, 2023-09-21) Goar, Megan; Horslen, Brian; Barnett-Cowan, Michael

Electrical vestibular stimulation (EVS) can be used to evoke reflexive body sways as a probe of vestibular control of balance. However, EVS introduces sensory conflict by decoupling vestibular input from actual body motion, prompting the central nervous system (CNS) to potentially perceive vestibular signals as less reliable. In contrast, light touch reduces sway by providing reliable feedback about body motion and spatial orientation. The juxtaposition of reliable and unreliable sensory cues enables exploration of multisensory integration during balance control. I hypothesized that when light touch is available, coherence and gain between EVS input and center of pressure (CoP) output would decrease as the CNS reduces the weighting of vestibular cues. Additionally, I hypothesized that the CNS would require less than 0.5 seconds to adjust the weighting of sensory cues upon introduction or removal of light touch. In two experiments, participants stood as still as possible while receiving continuous stochastic EVS (frequency of 0-25 Hz, amplitude of ± 4 mA, duration of 200-300 seconds) while either: lightly touching a load cell (<2 N); holding their hand above a load cell; or intermittently switching between touching and not touching the load cell. Anterior-posterior (AP) CoP and linear accelerations from body-worn accelerometers were collected to calculate the root mean square (RMS) of AP CoP, as well as the coherence and gain between EVS input and AP CoP or acceleration outputs. Light touch led to a decrease in CoP RMS (mean 49% decrease) with and without EVS. Significant coherence between EVS and AP CoP was observed between 0.5 Hz and 24 Hz in the NO TOUCH condition, and between 0.5 Hz and 30 Hz in the TOUCH condition, with TOUCH having significantly greater coherence from 11 to 30 Hz. Opposite to coherence, EVS-AP CoP gain decreased in the TOUCH condition between 0.5-8 Hz (mean decrease 63%). Among the available acceleration data, only the head exhibited a significant increase in coherence above 10 Hz in the TOUCH condition compared to the NO TOUCH condition. Light touch reduced CoP displacement, but increased the share of variation in the CoP signal that can be explained by EVS input. Light touch may cause the CNS to attribute EVS signals to head movements and therefore up-weight vestibulocollic responses while down-weighting vestibulospinal balance responses. Changes in coherence and gain started before the transition to the NO TOUCH condition and after the transition to the TOUCH condition. The loss of sensory information may be more destabilizing than its addition, necessitating anticipatory adjustments. These findings demonstrate the ability of one sensory modality to modulate the utilization of another by the CNS, and highlight asymmetries in the timing of responses to the introduction and removal of sensory information, which may impact behavior.
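For readers unfamiliar with the coherence and gain measures used in this abstract, the sketch below shows one standard way such frequency-domain estimates are computed between a stochastic stimulus and a sway response. The sampling rate, Welch segment length, and synthetic signals are placeholders for illustration only, not values or data from the thesis.

```python
import numpy as np
from scipy import signal

fs = 1000          # Hz; hypothetical sampling rate
nperseg = 4 * fs   # 4 s Welch segments (an assumed analysis choice)

# Placeholder signals standing in for the recorded data: "evs" is the
# stochastic stimulation input, "cop" a correlated sway-like output.
rng = np.random.default_rng(0)
evs = rng.standard_normal(300 * fs)
cop = np.convolve(evs, np.ones(50) / 50, mode="same") + rng.standard_normal(300 * fs)

# Root mean square of the demeaned AP CoP signal: overall sway magnitude
cop_rms = np.sqrt(np.mean((cop - cop.mean()) ** 2))

# Magnitude-squared coherence: per-frequency fraction of CoP variance
# linearly explained by the EVS input
f, coh = signal.coherence(evs, cop, fs=fs, nperseg=nperseg)

# Gain: |cross-spectrum| / input auto-spectrum, i.e. CoP response
# amplitude per unit of EVS at each frequency
_, s_xy = signal.csd(evs, cop, fs=fs, nperseg=nperseg)
_, s_xx = signal.welch(evs, fs=fs, nperseg=nperseg)
gain = np.abs(s_xy) / s_xx

band = (f >= 0.5) & (f <= 25)  # band of interest named in the abstract
print(f"CoP RMS: {cop_rms:.3f}  mean coherence 0.5-25 Hz: {coh[band].mean():.3f}")
```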
Item: The influence of body orientation relative to gravity on egocentric distance estimates in immersive virtual environments
(University of Waterloo, 2022-01-26) Martin Calderon, Claudia; Barnett-Cowan, Michael

Virtual reality head-mounted displays (VR-HMDs) are a flexible tool that can immerse individuals in a variety of virtual environments and can account for an individual's head orientation within these environments. Additionally, VR-HMDs allow participants to explore environments while maintaining different body positions (e.g., sitting and lying down). How these discrepancies between real-world body position and the virtual environment impact the perception of virtual space, or how a visual upright paired with incongruent changes in head orientation affects space perception within VR, has not been fully defined. In this study we sought to understand how changes in orientation (lying supine, lying prone, lying on the left side, and being upright) affect the perception of distance while a steady visual virtual upright (presented in the Oculus Rift DK1) is maintained. We used a new psychophysical perceptual-matching approach with two different probe configurations (L- and T-shaped) to extract distance perception thresholds in the four positions at egocentric distances of 4, 5, and 6 meters. Our results indicate that changes in orientation with respect to gravity impact the perception of distances within a virtual environment that is maintained at a visual upright. In particular, we found significant differences between perceived distances in the upright condition compared to the prone and left-side-lying positions. Additionally, we found that distance perception results were affected by differences in probe configuration. Our results add to a body of work examining how changes in head and body orientation can affect the perception of distance; however, more research is needed to fully understand how these changes with respect to gravity affect the perception of space within virtual environments.
Item: Multisensory Integrative Processes and Aging
(University of Waterloo, 2022-10-04) Basharat, Aysha; Barnett-Cowan, Michael

Our sensory systems provide us with distinct impressions of our surroundings, which are critical for perception, cognitive processing, and control of action. Indeed, input from multiple sensory stimuli compared to a single sensory stimulus increases the likelihood of detection, sensitivity, and the likelihood of correctly identifying the event. However, this process changes as we age. In this dissertation, I investigate the changes associated with auditory and visual integration in older adults by utilizing various psychophysical tasks. This dissertation aims: (1) to understand the relation between behavioural tasks that are commonly utilized to investigate multisensory integration, (2) to investigate how performance on these tasks changes when the central nervous system is aroused or stressed through the use of exercise (both in-person and virtually), and (3) to investigate the limitations and shortcomings of current practices in the multisensory integration literature. Results indicate that older adults are impaired in judging the temporal order of events; however, they also exhibit greater performance gains in response time to multisensory compared to uni-sensory stimuli. Further, results reveal that the integration process is malleable, and thus physical activity, both in-person and virtually, may be a useful intervention that can help improve the speed, accuracy, and precision with which older adults integrate multisensory information. A scoping review concludes the dissertation, revealing that only 60% and 50% of studies measure for age-abnormal hearing and vision, respectively, and that within these studies a consistent definition of what constitutes normal hearing and vision is not found.

Item: The Neural Processes of Perceived Simultaneity and Temporal Order in Younger and Older Adults using EEG
(University of Waterloo, 2017-08-02) Basharat, Aysha; Barnett-Cowan, Michael

In order to make sense of the world, the central nervous system (CNS) must determine the temporal order of events and integrate cues that belong together. The process of integrating information from multiple sensory modalities is referred to as multisensory integration. The importance of this process is evident in everyday events such as speech communication or watching a movie. These events give rise to both auditory and visual sensations that are truly simultaneous or successive, which the CNS must determine. This thesis presents two experiments designed to determine how the CNS of younger and older adults processes audiovisual information to identify simultaneity and temporal order of events. Twenty-eight younger (Experiment 1) and 28 older (Experiment 2) adults participated in audiovisual tasks in which they were asked to decide whether audiovisual stimuli were presented simultaneously or successively (SJ) or which stimulus was presented first (TOJ). The probabilities of judging a light and a sound as occurring simultaneously, or of a light occurring first, were calculated to extract the point of subjective simultaneity (PSS) and the temporal binding window (TBW). The TBW represents the time within which auditory and visual cues are most likely perceived as being simultaneous. Event-related potentials (ERPs) time-locked to light and sound onset presented at 4 different stimulus onset asynchronies (SOAs) were also recorded. Results revealed task-specific differences in perceiving simultaneity and temporal order, suggesting that each task may be subserved via different neural mechanisms. Auditory N1 and visual P1 ERP amplitudes confirmed that unisensory processing of audiovisual stimuli did not differ between the two tasks, indicating that performance differences between tasks arise from multisensory integration. Despite multisensory integration being implicated, the dissociation between SJ and TOJ was not revealed through auditory N1 and visual P1 amplitudes and latencies, indicating that the decision-making role of higher-level networks may contribute to the differences between the two tasks. Consistent with previous literature, behavioural data tended towards older adults having a wider TBW than younger adults. While all participants had reported normal audition and vision, older adults showed a later visual P1 latency, indicating that unisensory processing of visual information may be delayed with age. Compared to younger adults, older adults showed a sustained higher FCz auditory N1 ERP amplitude response across SOAs, which could correspond with broader response properties expected from an extended TBW. Together, this thesis provides compelling evidence that different neural mechanisms subserve the SJ and TOJ tasks and that simultaneity and temporal order perception change with age.
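The PSS and TBW recurring in these abstracts are conventionally estimated by fitting psychometric functions to response proportions across SOAs. The sketch below illustrates the TOJ case with a cumulative Gaussian fit; the SOAs and response proportions are invented placeholders, and the width convention shown is one common choice, not necessarily the one used in the theses.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: SOAs in ms (negative = sound first) and the
# proportion of "light first" responses at each SOA (invented values)
soa = np.array([-200.0, -100.0, -50.0, 0.0, 50.0, 100.0, 200.0])
p_light_first = np.array([0.05, 0.20, 0.35, 0.55, 0.70, 0.85, 0.95])

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_light_first, p0=[0.0, 100.0])

pss = mu                      # point of subjective simultaneity (50% point)
jnd = sigma * norm.ppf(0.75)  # 50%-to-75% spread, one common width convention

# The TBW is typically estimated from the SJ task instead, e.g. as the
# range of SOAs over which a fitted Gaussian of "simultaneous" response
# rates stays above a criterion.
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```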
Item: Perceived Timing of Active Head Movements at Different Speeds
(University of Waterloo, 2018-06-11) Sachgau, Carolin; Barnett-Cowan, Michael

The central nervous system must determine which sensory events occur at the same time. Actively moving the head corresponds with large changes in the relationship between the observer and the environment, sensorimotor processing, and spatiotemporal perception. Numerous studies have shown that head movement onset must precede the onset of other sensory events in order to be perceived as simultaneous, indicating that head movement perception is slow. In addition, active head movement perception has been shown to depend on head movement velocity, in that head movement perception is slower when the head moves faster. However, these findings were obtained between subjects, so they can only be interpreted as showing that participants who move their head faster than other participants require the head to move even earlier than comparison stimuli to be perceived as simultaneous. Previous findings cannot address the question of whether active head movement perception changes at higher speeds. The present study used a within-subjects design to measure the point of subjective simultaneity (PSS) between active head movement speeds and a comparison sound stimulus to properly characterize the correlation between the velocity and perception of head movement onset. Our results clearly show that (i) head movement perception is faster when the head moves faster within subjects, and (ii) active head movement onset must still precede the onset of other sensory events (average PSS: -123 ms to -52 ms) in order to be perceived as occurring simultaneously, even at the fastest speeds (average peak velocity: 76 deg/s to 257 deg/s). We conclude that head movement perception is slow, but that this delay is minimized with increased speed.
Item: Persistent perceptual delay for head movement onset relative to sound onset with and without vision
(University of Waterloo, 2017-02-15) Chung, William; Barnett-Cowan, Michael

Knowing when the head moves is crucial information for the central nervous system in order to maintain a veridical representation of the self in the world for perception and action. Our head is constantly in motion during everyday activities, and thus the central nervous system is challenged with determining the relative timing of multisensory events that arise from active movement of the head. The vestibular system plays an important role in the detection of head motion as well as in compensatory reflexive behaviours geared to stabilizing the self and the representation of the world. Although the transduction of vestibular signals is very fast, previous studies have found that the perceived onset of an active head movement is delayed when compared to other sensory stimuli such as sound, meaning that head movement onset has to precede a sound by approximately 80 ms in order to be perceived as simultaneous. However, this past research has been conducted with participants' eyes closed. Given that most natural head movements occur with input from the visual system, could perceptual delays in head movement onset be the result of removing visual input? In the current study, we set out to examine whether the inclusion of visual information affects the perceived timing of vestibular-auditory stimulus pairs. Participants performed a series of temporal order judgment tasks between their active head movement and an auditory tone presented at various stimulus onset asynchronies. Visual information was either absent (eyes closed) or present, while participants maintained fixation on an earth-fixed or head-fixed LED target in the dark or in the light. Our results show that head movement onset has to precede a sound with eyes closed. The results also suggest that head movement onset must still precede a sound when fixating targets in the dark, with a trend toward the head requiring less lead time when visual information is available and when the VOR is active or suppressed. Together, these results suggest that the perception of head movement onset is persistently delayed and is not fully resolved with full-field visual input.

Item: Psychometric correlates of multisensory integration as potential predictors of cybersickness in virtual reality
(University of Waterloo, 2019-08-16) Sadiq, Ogai; Barnett-Cowan, Michael

Humans are constantly presented with rich sensory information from the environment that the central nervous system (CNS) must process to form a coherent perception of the world. While the CNS may be efficient in doing so in natural environments, human-made environments such as virtual reality (VR) pose challenges for the CNS to integrate multisensory information. While VR systems are becoming widely used in various fields, they often cause cybersickness in users. Cybersickness may be due to temporal discrepancies in visually updating the environment after a movement. We sought to assess whether individual differences in the parameters of temporal order judgement of multisensory cues are related to cybersickness. We tested 50 participants in two different tasks. The first task involved two temporal order judgements: (1) an audio-visual (AV) task and (2) an audio-active head movement (AAHM) task, in which participants were presented with a sound paired with a visual or head-movement stimulus at different stimulus onset asynchronies. The second task involved exploration of two VR experiences for 30 minutes each, during which participants' cybersickness was quantified every 2 minutes on the fast motion sickness scale and at the end of the 30-minute period using the simulator sickness questionnaire (SSQ). Participants' visual acuity was also assessed. Results demonstrate a positive correlation between total SSQ scores and the temporal binding window (TBW) and point of subjective simultaneity (PSS) measures, indicating that individuals with wider AV TBWs or larger PSS measures may be more susceptible to cybersickness. We also find that individuals with higher visual acuity report lower sickness symptoms, which is contrary to previous studies. Such findings will generate a better understanding of cybersickness in VR, which in turn can inform the future development of virtual environments that minimize discomfort.
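The positive correlation reported above between SSQ scores and the TBW/PSS measures is the kind of per-participant association a simple Pearson test captures. A minimal sketch with invented placeholder data (not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Invented per-participant values: AV temporal binding window width (ms)
# and total SSQ score, built to be weakly related for demonstration
rng = np.random.default_rng(1)
tbw = rng.uniform(100, 400, size=50)
ssq = 10 + 0.05 * tbw + rng.normal(0, 5, size=50)

r, p = pearsonr(tbw, ssq)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```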
Item: The Relationship Between Heart Rate and Self-reported Sickness in Commercially Available Virtual Reality Environments
(University of Waterloo, 2020-05-13) Izadi Sokhtabandani, Siyavash; Barnett-Cowan, Michael

Virtual reality (VR), while beneficial in research, training, and entertainment, has a tendency to cause cybersickness (CS). Symptoms range from mild to severe depending on the individual. There is a gap in the academic literature regarding the combination of physiological and subjective measures of CS: it is currently unknown whether there is a relationship between these different measures and whether they can be used to predict CS before symptom development. A total of 18 young healthy adults were recruited. Participants explored a CS-inducing VR game, ADR1FT (AD), for up to a maximum of 30 minutes twice. In one condition they were asked to rate their sickness levels on the Fast Motion Sickness (FMS) scale every 2 minutes, and in the other condition they were asked to rate their sickness only at the beginning and end, while their heart rate (HR) was recorded. Both FMS and HR increased with prolonged exposure to VR. A paired t-test did not find the final FMS scores following the two conditions to be statistically different, suggesting that continuously asking for perceived ratings of sickness did not bias participants to report a higher final FMS score. Additionally, heterogeneous individual responses in FMS and HR revealed those who could be considered "responders" and "non-responders," suggesting that the response to CS could be bimodal: slow and fast responders. We suggest that these results can be explained by sensory conflict theory, wherein discrepancies between visual and vestibular inputs for self-motion affect both subjective and physiological responses.
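The paired t-test mentioned above compares the two rating schedules within the same participants. A minimal sketch with invented placeholder scores (not the study's data):

```python
import numpy as np
from scipy.stats import ttest_rel

# Invented final FMS scores for the same participants under the two
# rating schedules (every 2 minutes vs endpoints only)
fms_continuous = np.array([3, 5, 2, 8, 4, 6, 1, 7, 5, 3], dtype=float)
fms_endpoints = np.array([4, 5, 3, 7, 4, 5, 2, 8, 5, 4], dtype=float)

t, p = ttest_rel(fms_continuous, fms_endpoints)
print(f"t = {t:.2f}, p = {p:.3f}")  # a non-significant p mirrors the reported null
```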
Item: Sensory Conflict: Effects on the Perceived Onset of Motion and Cybersickness in Virtual Reality
(University of Waterloo, 2023-01-03) Chung, William; Barnett-Cowan, Michael

The perception of self-motion involves the integration of multisensory information; however, there are scenarios in which the sensory feedback we receive from these different sources can conflict with one another. For example, when inside the cabin of a ship at sea or playing a game in virtual reality (VR), sensory signals for self-motion from the visual and vestibular systems may not be congruent. It has been well documented that such scenarios are associated with feelings of discomfort and alterations in our perception of motion, but the mechanisms leading to these perceptual consequences remain uncertain. The goal of this dissertation is to explore the effect of sensory conflict between vestibular and visual signals on the perception of self-motion and its implications for cybersickness. Chapter Two examined the effect of sensory conflict on the perceived timing of a passive whole-body rotation paired with both congruent and incongruent visual feedback using VR. It was found that the visual signal only influenced the perception of movement onset when the direction of the visual motion did not match the expected equal-and-opposite response relative to the physical rotation. In Chapter Three, the effect of sensory conflict between visual, vestibular, and body cues on the perceived timing of visual motion was explored. The results revealed that changing the orientation of the body relative to gravity, dissociating the relationship between vestibular and body cues of upright, delays the perceived onset of visual yaw rotation in VR by an additional 30 ms compared to an upright posture. Lastly, Chapter Four investigated the relationship between sensory conflict and sensory reweighting through measures of cybersickness and sensory perception after exposure to VR gameplay. The results indicated that the perception of subjective vertical was significantly influenced by an intense VR experience and that sensory reweighting may play a role in this effect, along with providing a potential explanation for individual differences in cybersickness severity. Altogether, this dissertation highlights some of the perceptual consequences of sensory conflict between vestibular and visual signals and provides insights into the potential mechanisms that determine the perception of self-motion and cybersickness in VR.

Item: Visual determinants of postural control and perception during physical and visual motion
(University of Waterloo, 2023-01-30) McIlroy, Robert; Barnett-Cowan, Michael

The control of balance and posture is a critical task of daily life that limits the risk of falls and potential injury. To control balance successfully, the central nervous system utilizes sensory feedback from the visual, proprioceptive/somatosensory, and vestibular systems. The detection, processing, and perception of these sensory cues allow us to form an accurate representation of postural events and respond accordingly. In this dissertation I investigate how we perceive postural events, how this perception can change with altered visual cues introduced through virtual reality, and how virtual visual motion with differing context can alter postural responses. This dissertation aims to determine the following: (1) whether methodological changes affect an individual's perception of postural instability onset, (2) whether visual information can alter our perception of instability onset, and (3) whether visual motion with differing visual characteristics can alter postural responses. Results indicate that the methodology used during a temporal order judgement task affects the perception of postural instability onset. Additionally, it was observed that virtual visual height impacts the precision of perceptual responses to postural instability onset. Finally, postural responses to virtual visual motion with differing visual context appeared to be affected only by visual motion duration. However, there were also strong individual differences in postural responses to visual motion, which have not been broadly addressed in the literature. As a whole, this thesis exemplifies the importance of visual information for both perceptual and behavioural responses related to posture and balance.