Date of Completion

9-22-2016

Embargo Period

9-22-2018

Keywords

VR, NUI, 3DUI, Machine Learning, Intent Inference

Major Advisor

Horea Ilies

Associate Advisor

Kristine Nowak

Associate Advisor

Krishna Pattipati

Associate Advisor

Jiong Tang

Associate Advisor

George Lykotrafitis

Field of Study

Mechanical Engineering

Degree

Doctor of Philosophy

Open Access

Open Access

Abstract

A virtual reality (VR) environment is defined as a computer-generated representation of reality that is sensitive to the actions of its observer. As the computing power of our machines follows an ever-growing trend, the simulation power of our VR applications and their impact on the development of our society continue to grow in a remarkable fashion. Along with our computing capabilities, the data that needs to be spatially manipulated continuously increases in size and diversity. To keep up with this trend of increasing complexity, we need to develop new 3D user interfaces (3DUIs) that allow users to employ the full manipulative capabilities of their natural hand gestures when manipulating such data. Today we can approach this goal by tracking the natural hand gestures of our users and inferring their manipulative intentions. However, natural human hand gestures exhibit a large variability that is aggravated by hand-placement inaccuracies and body-tracking uncertainties. Additionally, there is a non-unique mapping between human gestures and the underlying manipulative intentions.

In this dissertation, I lay out the foundation of a general manipulative intention inference framework. New metrics are proposed for quantifying a set of human behavioral cues that characterize general goal-directed actions. The relationship between these behavioral cues and a user's manipulative intent is modeled using machine learning techniques in a novel fashion. The practical value of these techniques is demonstrated by developing new virtual object manipulation methods that are driven by intention inference. By means of intention inference, the proposed interaction techniques automatically adapt to the user's subjective needs for aids such as hand-placement fault tolerance and improved hand-positioning precision. The performance of the resulting virtual object manipulation techniques has been evaluated through user studies with statistically significant results.
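As a rough illustration of the inference pattern summarized above, and not of the dissertation's actual models or features, the following Python sketch maps a few assumed behavioral-cue metrics to a grasp/no-grasp intent probability with an off-the-shelf classifier; the cue names, the synthetic data, and the choice of logistic regression are all placeholder assumptions.

```python
# Hypothetical sketch: mapping behavioral-cue features to a manipulative-intent
# label with a probabilistic classifier. The cue names and model choice are
# illustrative assumptions, not the dissertation's actual formulation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training data: each row holds example cue metrics such as
# hand-approach speed, fingertip aperture, and distance to the nearest object.
X_train = rng.random((200, 3))
# Synthetic intent labels: 0 = no grasp intended, 1 = grasp intended.
y_train = (X_train[:, 2] < 0.3).astype(int)

# Fit a simple probabilistic model of P(intent | behavioral cues).
model = LogisticRegression().fit(X_train, y_train)

# At run time, the inferred intent probability could gate an adaptive
# manipulation aid (e.g., fault-tolerant grasp acquisition or precision snapping).
cues = np.array([[0.8, 0.4, 0.1]])
p_grasp = model.predict_proba(cues)[0, 1]
if p_grasp > 0.5:
    print(f"Grasp intent inferred (p = {p_grasp:.2f}); enable assistance.")
else:
    print(f"No grasp intent inferred (p = {p_grasp:.2f}).")
```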

The work presented here advances the state of the art in 3DUIs toward more user-friendly, even person-centered, interfaces by developing user-adaptable interfaces driven by intention inference. This can dramatically shorten the time a novice user needs before performing efficient virtual object manipulations.
