
dc.contributor.author: Greene, Eugene Dominic
dc.date.accessioned: 2011-08-29 16:19:21 (GMT)
dc.date.available: 2011-08-29 16:19:21 (GMT)
dc.date.issued: 2011-08-29T16:19:21Z
dc.date.submitted: 2011
dc.identifier.uri: http://hdl.handle.net/10012/6161
dc.description.abstract: Direct interaction in virtual environments can be realized with relatively simple hardware, such as standard webcams and monitors. The result is a large gap between the stimuli present in real-world interactions and those provided in the virtual environment, which reduces efficiency and effectiveness when performing tasks. Conceivably these missing stimuli could be supplied through the visual modality, using sensory substitution. This work proposes a display technique that employs sensory substitution to convey proximity, tactile, and force information usefully and without detriment. We address three problems with existing feedback mechanisms: when adding information to existing visuals, we need to balance not occluding the existing visual output; not causing the user to look away from that output or otherwise distracting the user; and displaying as much new information as possible. We assume the user interacts with a virtual environment consisting of a manually controlled probe and a set of surfaces. Our solution is a pseudo-shadow: a shadow-like projection of the user's probe onto the surface being explored or manipulated. Instead of drawing the probe, we draw only the pseudo-shadow and use it as a canvas on which to add other information. Static information is displayed by varying the parameters of a procedural texture rendered in the pseudo-shadow, while probe velocity and probe-surface distance modify this texture to convey dynamic information. Much of the computation occurs on the GPU, so the pseudo-shadow renders quickly enough for real-time interaction. This work thus makes three contributions: a simple collision detection and handling mechanism that generalizes to distance-based force fields; a way to display content during probe-surface interaction that reduces occlusion and spatial distraction; and a way to visually convey small-scale tactile texture.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: Computer Graphics
dc.subject: Human-Computer Interaction
dc.title: Augmenting Visual Feedback Using Sensory Substitution
dc.type: Master Thesis
dc.pending: false
dc.subject.program: Computer Science
uws-etd.degree.department: School of Computer Science
uws-etd.degree: Master of Mathematics
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
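
The following is a minimal, illustrative sketch of the pseudo-shadow idea described in the abstract, assuming a planar surface and a point probe. The function names, parameters, and falloff/grain formulas are assumptions for illustration only and are not taken from the thesis, which renders the texture as a GPU procedural shader rather than in CPU code like this.

# Illustrative sketch (not the thesis implementation): project a point probe
# onto a planar surface to place a pseudo-shadow, then evaluate a simple
# procedural texture whose parameters encode probe-surface distance and speed.
import numpy as np

def project_onto_plane(p, plane_point, plane_normal):
    """Orthogonally project probe position p onto the surface plane.

    Returns the projected point (pseudo-shadow center) and the signed
    probe-surface distance along the plane normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(p - plane_point, n)
    return p - d * n, d

def pseudo_shadow_intensity(uv, shadow_center_uv, distance, speed,
                            base_radius=0.05, spread=0.5, freq=40.0):
    """Procedural texture evaluated at surface coordinate uv.

    The shadow grows and softens with probe-surface distance (a proximity
    cue), and a sinusoidal grain shifts with probe speed (a dynamic cue)."""
    r = np.linalg.norm(uv - shadow_center_uv)
    radius = base_radius + spread * abs(distance)       # farther probe -> larger, fainter shadow
    falloff = np.clip(1.0 - r / radius, 0.0, 1.0)       # radial falloff inside the shadow
    grain = 0.5 + 0.5 * np.sin(freq * r + 10.0 * speed) # grain modulated by probe speed
    return falloff * grain

# Example: probe hovering 2 cm above the origin of the z = 0 surface, moving slowly.
probe_pos = np.array([0.1, 0.2, 0.02])
shadow_center, dist = project_onto_plane(probe_pos,
                                         plane_point=np.zeros(3),
                                         plane_normal=np.array([0.0, 0.0, 1.0]))
print(pseudo_shadow_intensity(np.array([0.1, 0.21]), shadow_center[:2], dist, speed=0.05))

In a real-time system this per-point evaluation would run per fragment on the GPU, which is what allows the pseudo-shadow to be redrawn quickly enough for interactive use, as the abstract notes.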

