
Role

Individual Project

While at

RIOS

Company

Bob Frederick & Clancy Pearson

Time

May to Dec 2019

Hand MR

Hand-gesture-based interactions in a mixed-reality environment.

01. Intro

What I Did

After analyzing the needs of the company's architecture, landscape, interior, and product designers, I tailored the hand-gesture MR interaction experience to their different scenarios.

To take full advantage of the ubiquity of hand gestures, I worked primarily with Leap Motion hand tracking in MR. Through a series of prototypes spanning body-environment fusion (in both depiction and interaction) and multisensory integration, I documented how familiarity develops from direct, embodied engagement with novel hyper-physics.

This research project presents a set of concerns and possibilities for spatial interaction, in the hope that such awareness will inspire new frameworks for conceiving spatial interactions and the role of embodiment in spatial computing.

02. Research Questions

Interactions

What are the most intuitive bodily interactions in MR?

Interfaces

How can we leverage different types of spatial interfaces?

Affordances

What affordances allow users to naturally grasp the interactions?

03. Outcome

Final demo

This experience is tailored to the company's interior designers: they can drag the model they are working on off the monitor and into the real environment to test it out (a rough sketch of the pinch-and-drag logic follows below).

Drag & Drop

EXPERIENCE
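
To make the drag & drop interaction above concrete, here is a minimal sketch, assuming the kind of per-frame fingertip positions a hand tracker such as Leap Motion reports: the thumb-index distance (with a small hysteresis band) decides when the model is grabbed, and the grabbed model follows the pinch point until release. The class name, thresholds, and model interface are illustrative, not the project's actual code.

```python
# Minimal pinch-and-drag sketch (illustrative assumptions, not the project's code).
# A hand tracker reports fingertip positions every frame; thumb-index distance
# with hysteresis decides grab/release, and a grabbed model follows the pinch.

import math

PINCH_START = 0.025   # metres: fingertips closer than this -> grab
PINCH_END   = 0.045   # metres: fingertips farther than this -> release

class DragController:
    def __init__(self, model):
        self.model = model        # any object exposing a .position = (x, y, z)
        self.dragging = False

    def update(self, thumb_tip, index_tip):
        """Call once per tracking frame with (x, y, z) fingertip positions."""
        gap = math.dist(thumb_tip, index_tip)
        if not self.dragging and gap < PINCH_START:
            self.dragging = True                  # pinch closed: grab the model
        elif self.dragging and gap > PINCH_END:
            self.dragging = False                 # pinch opened: drop it in place
        if self.dragging:
            # snap the model to the midpoint of the pinch each frame
            self.model.position = tuple(
                (a + b) / 2 for a, b in zip(thumb_tip, index_tip)
            )
```

The two different thresholds keep tracking jitter near the grab distance from dropping the model unintentionally.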

Final demo

This experience is designed for architects presenting proposals in a client meeting; they can pick up and manipulate the model with bare hands (a sketch of the two-handed scaling logic follows below).

Manipulate it with bare hands

EXPERIENCE
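
For the bare-hand manipulation shown above, one common pattern is two-handed scaling: while both hands pinch, the model's scale follows the ratio between the current distance of the two pinch points and their distance at the moment of the grab. The sketch below illustrates that pattern under the same assumptions as before; it is not the project's code.

```python
# Two-hand scaling sketch (illustrative, not the project's code): spread or
# squeeze two pinching hands to resize a grabbed model.

import math

class TwoHandScale:
    def __init__(self, model):
        self.model = model          # any object exposing a .scale float
        self.start_gap = None
        self.start_scale = None

    def update(self, left_pinching, right_pinching, left_point, right_point):
        """Call once per tracking frame with pinch states and (x, y, z) pinch points."""
        if left_pinching and right_pinching:
            gap = math.dist(left_point, right_point)
            if self.start_gap is None:              # both hands just grabbed
                self.start_gap, self.start_scale = gap, self.model.scale
            else:                                   # hands moved apart or together
                self.model.scale = self.start_scale * gap / self.start_gap
        else:
            self.start_gap = None                   # a hand released: freeze the scale
```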

04. Research Process

I. INTERACTIONS

Hand as input device

We use our hands to control the world around us; they are our best input device, even though the ways we use them are dictated by the limits of the physical world. It therefore makes sense that we would also use our hands to control a spatial computing environment.

I. INTERACTIONS

Still Gesture

Tracking points

Pose type
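
As a concrete reading of the still-gesture study above, here is a minimal sketch that maps per-finger extension flags, the kind of per-frame pose data Leap Motion exposes, to a named pose. The pose names and rules are illustrative assumptions.

```python
# Still-gesture classification sketch (illustrative rules, not the project's code):
# map which fingers are extended in the current frame to a named pose.

def classify_pose(extended):
    """extended: dict of finger name -> bool, e.g. from a hand tracker."""
    fingers = ["thumb", "index", "middle", "ring", "pinky"]
    up = [f for f in fingers if extended.get(f, False)]
    if not up:
        return "fist"           # every finger curled
    if up == ["index"]:
        return "point"          # only the index finger extended
    if len(up) == 5:
        return "open_palm"      # every finger extended
    return "unknown"

# Example: a pointing hand
print(classify_pose({"thumb": False, "index": True, "middle": False,
                     "ring": False, "pinky": False}))   # -> point
```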

I. INTERACTIONS

Moving Gesture

II. INTERFACES

Wearable Interfaces

Now that digital content has become spatial, the ways we can interact with it and control it are completely different. My goal was to create new paradigms and metaphors for interacting with that content. The first step was to understand the different ways content can manifest, in order to understand the interactions that could be designed around it.

II. INTERFACES

Body-anchored Interfaces

III. AFFORDANCE

Key to the sense of reality

III. AFFORDANCE

Selection of Head-mounted devices

After testing a series of mainstream mixed-reality HMDs on the market, including HoloLens and Magic Leap, I found that they suffered from poor rendering, low opacity, and a lack of occlusion, which kept the content from integrating convincingly with the environment. I eventually chose the video see-through solution (Oculus + ZED Mini + Leap Motion), even though it is not as portable as the other devices.

III. AFFORDANCE

Component assembly

III. AFFORDANCE

Synchronization

Changes made in Rhino, Maya, or 3ds Max can be streamed to the object in the real environment in real time.
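
One way such synchronization can be wired up, sketched under assumed details (the endpoint, message format, and function names below are hypothetical, not the project's actual pipeline): the modelling tool sends a small JSON update over a local socket whenever the model changes, and the MR side applies each update to the placed object.

```python
# Synchronization sketch (assumed endpoint and message format, not the project's
# actual pipeline): stream object updates from a modelling tool to the MR scene.

import json
import socket

HOST, PORT = "127.0.0.1", 9000    # assumed local endpoint

def send_update(obj_id, position, rotation, scale):
    """Modelling-tool side: send one update whenever the model changes."""
    msg = json.dumps({"id": obj_id, "pos": position, "rot": rotation, "scl": scale})
    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(msg.encode() + b"\n")

def listen_for_updates(apply_fn):
    """MR side: apply each incoming update to the rendered object."""
    with socket.create_server((HOST, PORT)) as server:
        while True:
            conn, _ = server.accept()          # one short-lived connection per update
            with conn, conn.makefile() as stream:
                for line in stream:
                    apply_fn(json.loads(line))
```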

POTENTIAL

MR x IoT

Imagine that once your MR headset is connected to all the appliances in your home, you can access all of their functions and control them precisely through virtual wearables and basic gestures. This technology could significantly improve everyday convenience, especially for people with accessibility needs.

Looking at the future of interactive technology, I can't help but imagine that physical devices will shrink until they are effectively invisible, and that people will be able to act on reality through near naked-eye VR interaction that enlarges and extends their physical capabilities. It is like a superpower. A new generation of interaction methods could grant this superpower to help compensate for, or even overcome, congenital physical impairments.
