Visual Inputs and Motor Outputs as Individuals Walk Through Dynamically Changing Environments

dc.contributor.author: Cinelli, Michael
dc.date.accessioned: 2006-12-01T21:19:36Z
dc.date.available: 2006-12-01T21:19:36Z
dc.date.issued: 2006-12-01T21:19:36Z
dc.date.submitted: 2006-08-24
dc.description.abstract: Walking through dynamically changing environments requires the integration of three of our sensory systems: visual, vestibular, and kinesthetic. Of these three, vision is the only modality that provides information at a distance for proactively controlling locomotion (Gibson, 1958). The visual system provides information about self-motion, about the position of the body and of body segments relative to one another and to the environment, and about the environment at a distance (Patla, 1998). Gibson (1979) developed the idea that everyday behaviour is controlled by perception-action coupling between an action and specific information picked up from the optic flow generated by that action: visual perception guides the action required to navigate safely through an environment, and the action in turn alters perception. The objective of my thesis was to determine how well perception and action are coupled when approaching and walking through moving doors with dynamically changing apertures. My first two studies were grouped together; here I found that as the level of threat increased, the parameters of control changed but the controlling mechanism did not. The two dominant action control parameters observed were a change in approach velocity and a change in posture (i.e. shoulder rotation). These findings add to previous work done in this area using a similar set-up in virtual reality, where, after much practice, participants increased their success rate by decreasing velocity prior to crossing the doors. In my third study I found that visual fixation patterns and action parameters were similar whether or not the location of the aperture was predictable. Previous work by other researchers has shown that vision and a subsequent action are tightly coupled, with a latency of about 1 second. I found that vision is tightly coupled to action only when a specific action is required and the threat of a collision increases. My findings also agree with previous work showing that individuals look where they are going. My last study was designed to determine whether we go where we are looking. Here I found that action does follow vision but is only loosely correlated with it. The most important finding common to all the studies is that at 2 seconds prior to crossing the moving doors (regardless of the type of door movement), vision seems to have its most profound effect on action: at this time, variability in action is significantly lower than at earlier times. I believe that my findings will help explain how individuals use vision to modify their actions in order to avoid colliding with other people or other moving objects in the environment, and that this knowledge will help elderly individuals better cope with walking in cluttered environments while avoiding contact with obstacles.
dc.format.extent: 14225907 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/10012/2610
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: human locomotion
dc.subject: motor behaviours
dc.subject: visual behaviours
dc.subject: perception
dc.subject.program: Kinesiology (Behavioural Neuroscience)
dc.title: Visual Inputs and Motor Outputs as Individuals Walk Through Dynamically Changing Environments
dc.type: Doctoral Thesis
uws-etd.degree: Doctor of Philosophy
uws-etd.degree.department: Kinesiology
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle

Name: wholeManuscript.pdf
Size: 13.57 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 144 B
Format: Item-specific license agreed upon to submission