User-Defined Gestures with Physical Props in Virtual Reality


Date

2020-09-01

Authors

Moran Ledesma, Marco Aurelio

Advisor

Hancock, Mark
Schneider, Oliver

Publisher

University of Waterloo

Abstract

When building virtual reality (VR) environments, designers use physical props to improve immersion and realism. However, people may want to perform actions that physical objects cannot support, for example, duplicating an object in a Computer-Aided Design (CAD) program or darkening the sky in an open-world game. In this thesis, I present an elicitation study in which I asked 21 participants to choose from 95 props to perform manipulative gestures for 20 referents (actions) typically found in CAD software or open-world games. I describe the resulting gestures as context-free grammars, capturing the actions taken by our participants, their prop choices, and how the props were used in each gesture. I present agreement scores between gesture choices and prop choices; to accomplish the latter, I developed a generalized agreement score that compares sets of selections rather than a single selection, enabling new types of elicitation studies. I found that props were selected according to their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices while others led to similar prop choices; and that a small set of carefully chosen props can support a wide variety of gestures.
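To illustrate the idea behind a set-based agreement score, here is a minimal sketch. The classic elicitation agreement rate counts the fraction of participant pairs who proposed the identical gesture for a referent; the thesis's exact generalization is not reproduced on this page, so the set-based variant below is a hypothetical one that averages pairwise Jaccard similarity between participants' prop sets. Function names and the Jaccard choice are illustrative assumptions, not the author's published formula.

```python
from itertools import combinations

def agreement_rate(proposals):
    """Classic agreement rate: fraction of participant pairs who made
    the identical (single) proposal for one referent."""
    pairs = list(combinations(proposals, 2))
    if not pairs:
        return 1.0
    return sum(a == b for a, b in pairs) / len(pairs)

def set_agreement_rate(selections):
    """Hypothetical set-based generalization (assumption, not the
    thesis's formula): mean pairwise Jaccard similarity between
    participants' sets of selections for one referent."""
    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 1.0
    pairs = list(combinations(selections, 2))
    if not pairs:
        return 1.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Three participants proposing gestures for one referent:
agreement_rate(["pinch", "pinch", "swipe"])          # 1 of 3 pairs agree
# Three participants each selecting a set of props:
set_agreement_rate([{"cube"}, {"cube", "wand"}, {"wand"}])
```

Because each participant contributes a set rather than a single choice, partial overlap between participants earns partial credit, which is what lets the score handle multi-prop selections at all.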

Keywords

input techniques, empirical study that tells us about how people use a system, touch/haptic/pointing/gesture, virtual reality, 3D physical props, agreement rate
