Simple item record

dc.contributor.author: Avery, Jeff
dc.date.accessioned: 2018-12-21 18:46:38 (GMT)
dc.date.available: 2019-04-21 04:50:08 (GMT)
dc.date.issued: 2018-12-21
dc.date.submitted: 2018-12-17
dc.identifier.uri: http://hdl.handle.net/10012/14279
dc.description.abstract: Recent technological advances have resulted in a major shift, from high-performance notebook and desktop computers -- devices that rely on keyboard and mouse for input -- towards smaller, personal devices such as smartphones, tablets and smartwatches, which rely primarily on touch input. Users of these devices typically have a relatively high level of skill in using multi-touch gestures to interact with them, but the supported multi-touch gesture sets are often restricted to a small set of one- and two-finger gestures, such as tap, double tap, drag, flick, pinch and spread. This is not due to technical limitations, since modern multi-touch smartphones and tablets are capable of accepting at least ten simultaneous points of contact. Likewise, human movement models suggest that humans are capable of richer and more expressive forms of interaction that use multiple fingers. This suggests a gap between the technical capabilities of multi-touch devices, the physical capabilities of end-users, and the gesture sets that have been implemented for these devices. Our work explores ways to enrich multi-touch interaction on these devices by expanding these common gesture sets. Simple gestures are adequate for simple use cases, but supporting a wide range of sophisticated behaviours -- the types of interactions required by expert users -- demands equally sophisticated capabilities from our devices. In this thesis, we refer to these more sophisticated, complex interactions as 'enhanced gestures' to distinguish them from common but simple gestures, and to suggest the types of expert scenarios that we target in their design. We do not necessarily need to replace current, familiar gestures, but it makes sense to consider augmenting them as multi-touch becomes more prevalent and is applied to more sophisticated problems. This research explores issues of approachability and user acceptance around gesture sets. Using pinch-to-zoom as an example, we establish design guidelines for enhanced gestures, and systematically design, implement and evaluate two different types of expert gestures, illustrative of the type of functionality that we might build into future systems.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: multitouch
dc.subject: gesture
dc.subject: evaluation
dc.subject: interaction
dc.subject: smartphone
dc.subject: tablet
dc.subject: pinch-to-zoom
dc.title: Enhanced Multi-Touch Gestures for Complex Tasks
dc.type: Doctoral Thesis
dc.pending: false
uws-etd.degree.department: David R. Cheriton School of Computer Science
uws-etd.degree.discipline: Computer Science
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Doctor of Philosophy
uws-etd.embargo.terms: 4 months
uws.contributor.advisor: Lank, Edward
uws.contributor.advisor: Vogel, Daniel
uws.contributor.affiliation1: Faculty of Mathematics
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
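
The abstract above notes that modern touch screens can track at least ten simultaneous contact points even though common gesture sets stop at one or two fingers. As a rough illustration of that point (not code from the thesis), the following Kotlin sketch uses Android's standard MotionEvent API to count the fingers currently touching a custom view; the class name and callback are hypothetical.

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View

// Illustrative only: a custom View that reports how many fingers are
// currently in contact with the screen. The class name and callback are
// hypothetical, not taken from the thesis.
class ContactCountView(context: Context) : View(context) {

    // Called whenever the number of simultaneous contacts changes.
    var onContactCountChanged: ((Int) -> Unit)? = null

    private var lastCount = 0

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // pointerCount is the number of touch points reported in this event;
        // many current phones and tablets track ten or more simultaneously.
        val count = when (event.actionMasked) {
            MotionEvent.ACTION_UP,
            MotionEvent.ACTION_CANCEL -> 0
            // The lifting finger is still included in pointerCount here.
            MotionEvent.ACTION_POINTER_UP -> event.pointerCount - 1
            else -> event.pointerCount
        }
        if (count != lastCount) {
            lastCount = count
            onContactCountChanged?.invoke(count)
        }
        return true // keep receiving the rest of the gesture
    }
}
```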

