User-Defined Gestures with Physical Props in Virtual Reality
When building virtual reality (VR) environments, designers use physical props to improve immersion and realism. However, people may want to perform actions that no physical object directly supports, for example, duplicating an object in a Computer-Aided Design (CAD) program or darkening the sky in an open-world game. In this thesis, I present an elicitation study in which I asked 21 participants to choose from 95 props to perform manipulative gestures for 20 referents (actions) typically found in CAD software or open-world games. I describe the resulting gestures as context-free grammars, capturing the actions taken by the participants, their prop choices, and how the props were used in each gesture. I present agreement scores for both gesture choices and prop choices; for the latter, I developed a generalized agreement score that compares sets of selections rather than single selections, enabling new types of elicitation studies. I found that props were selected according to their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices and others to similar prop choices; and that a small set of carefully chosen props can support a wide variety of gestures.
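To illustrate the idea of an agreement score over sets of selections, the following is a minimal sketch assuming a pairwise Jaccard-based generalization: for each referent, agreement is the mean Jaccard similarity of prop sets over all pairs of participants. The function name `set_agreement` and the Jaccard formulation are illustrative assumptions; the thesis's actual formula may differ.

```python
from itertools import combinations

def set_agreement(selections):
    """Hypothetical generalized agreement score for one referent.

    `selections` is a list of sets, one per participant (e.g. the
    props each participant chose). The score is the mean Jaccard
    similarity over all participant pairs. This is an illustrative
    sketch, not the thesis's exact formula.
    """
    pairs = list(combinations(selections, 2))
    if not pairs:  # fewer than two participants: trivially in agreement
        return 1.0

    def jaccard(a, b):
        union = a | b
        return len(a & b) / len(union) if union else 1.0

    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# When every participant picks exactly one item, each pair either
# agrees (Jaccard 1) or disagrees (Jaccard 0), so the score reduces
# to classic pairwise single-selection agreement.
print(set_agreement([{"cube"}, {"cube"}, {"sphere"}]))   # 1/3
print(set_agreement([{"cube", "rod"}, {"cube"}]))        # 1/2
```

With single-item sets the score matches pairwise single-selection agreement, while genuine sets receive partial credit for overlap, which is what lets the measure handle multi-prop choices.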
Cite this version of the work
Marco Aurelio Moran Ledesma (2020). User-Defined Gestures with Physical Props in Virtual Reality. UWSpace. http://hdl.handle.net/10012/16209