User-Defined Gestures with Physical Props in Virtual Reality
dc.contributor.author | Moran Ledesma, Marco Aurelio | |
dc.date.accessioned | 2020-09-01T17:03:57Z | |
dc.date.available | 2020-09-01T17:03:57Z | |
dc.date.issued | 2020-09-01 | |
dc.date.submitted | 2020-08-28 | |
dc.description.abstract | When building virtual reality (VR) environments, designers use physical props to improve immersion and realism. However, people may want to perform actions that would not be supported by physical objects, for example, duplicating an object in a Computer-Aided Design (CAD) program or darkening the sky in an open-world game. In this thesis, I present an elicitation study where I asked 21 participants to choose from 95 props to perform manipulative gestures for 20 referents (actions) typically found in CAD software or open-world games. I describe the resulting gestures as context-free grammars, capturing the actions taken by our participants, their prop choices, and how the props were used in each gesture. I present agreement scores for both gesture choices and prop choices; to compute the latter, I developed a generalized agreement score that compares sets of selections rather than a single selection, enabling new types of elicitation studies. I found that props were selected according to their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices, while others led to similar prop choices; and that a small set of carefully chosen props can support a wide variety of gestures. | en |
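The abstract mentions a generalized agreement score that compares sets of selections rather than a single selection. As a rough illustration only (the thesis's exact formulation is not given here), the sketch below computes the classic elicitation-study agreement rate over single proposals and one plausible set-based generalization that averages pairwise Jaccard similarity over participant pairs; the function names and the Jaccard choice are assumptions for this example, not the author's method.

```python
from itertools import combinations

def agreement_rate(proposals):
    """Classic agreement rate: the fraction of participant pairs that
    proposed the exact same item (e.g. gesture) for a referent."""
    pairs = list(combinations(proposals, 2))
    if not pairs:
        return 1.0
    return sum(a == b for a, b in pairs) / len(pairs)

def set_agreement_rate(selections):
    """Illustrative set-based generalization (assumption, not the thesis's
    formulation): each participant contributes a *set* of selections, e.g.
    the props used in a gesture; pairwise agreement is the Jaccard index,
    averaged over all participant pairs."""
    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if (a | b) else 1.0
    pairs = list(combinations(selections, 2))
    if not pairs:
        return 1.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical example: three participants' choices for one referent.
print(agreement_rate(["sphere", "sphere", "cube"]))                    # 0.333...
print(set_agreement_rate([{"sphere", "wand"}, {"sphere"}, {"cube"}]))  # 0.166...
```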
dc.identifier.uri | http://hdl.handle.net/10012/16209 | |
dc.language.iso | en | en |
dc.pending | false | |
dc.publisher | University of Waterloo | en |
dc.subject | input techniques | en |
dc.subject | empirical study that tells us about how people use a system | en |
dc.subject | touch/haptic/pointing/gesture | en |
dc.subject | virtual reality | en |
dc.subject | 3D physical props | en |
dc.subject | agreement rate | en |
dc.title | User-Defined Gestures with Physical Props in Virtual Reality | en |
dc.type | Master Thesis | en |
uws-etd.degree | Master of Applied Science | en |
uws-etd.degree.department | Systems Design Engineering | en |
uws-etd.degree.discipline | Systems Design Engineering | en |
uws-etd.degree.grantor | University of Waterloo | en |
uws.contributor.advisor | Hancock, Mark | |
uws.contributor.advisor | Schneider, Oliver | |
uws.contributor.affiliation1 | Faculty of Engineering | en |
uws.peerReviewStatus | Unreviewed | en |
uws.published.city | Waterloo | en |
uws.published.country | Canada | en |
uws.published.province | Ontario | en |
uws.scholarLevel | Graduate | en |
uws.typeOfResource | Text | en |