Background

André Zenner edited this page Apr 8, 2021 · 7 revisions

What is Hand Redirection?

Hand Redirection is a technique that cleverly takes advantage of how the human perceptual system works (specifically, of visual dominance) in order to let virtual reality (VR) applications "control" where users reach with their hand in the real world while they reach for a target (i.e. move their virtual hand) in the virtual environment. In other words, the goal of hand redirection is to redirect a person's hand while it is in motion - letting a user grasp a virtual object with the virtual hand, while actually grasping a (potentially displaced!) physical object with the real hand.

In a VR system, users see only a virtual representation of their hand. Normally, this virtual hand appears at the exact same position as the physical hand, but we can manipulate the movement of the virtual hand to a certain degree without the user noticing. This works because visual feedback often dominates the other senses when there is a mismatch between the virtual and the real world (a perceptual phenomenon known as visual dominance). For example, if we gradually shift a user's virtual hand slightly to the right while it is in motion, the user, steering by the visible virtual hand, unconsciously compensates for this shift by keeping the real hand further to the left. This results in a discrepancy between the virtual and the real hand: in this example, the user's virtual hand drifts to the right whereas the real hand moves only forwards. You can see this example in the animation below. Blue represents the real hand/object and green the virtual hand/object.

(Animation: body_warping)
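The incremental shift described above can be expressed in a few lines. The following is a minimal, illustrative Python sketch (the toolkit itself is implemented in Unity/C#); the function name and the linear interpolation scheme are assumptions for illustration, not the toolkit's actual implementation:

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def body_warp(real_hand, reach_start, physical_target, virtual_target):
    """Return the virtual hand position: no offset at the start of the reach,
    full offset (virtual_target - physical_target) once the prop is touched."""
    total = dist(reach_start, physical_target)
    progress = min(dist(reach_start, real_hand) / total, 1.0) if total > 0 else 1.0
    offset = (virtual_target[0] - physical_target[0],
              virtual_target[1] - physical_target[1])
    return (real_hand[0] + progress * offset[0],
            real_hand[1] + progress * offset[1])

# Real hand moves straight forward to a prop at (0, 1); the virtual object
# sits displaced at (0.3, 1), so the virtual hand gradually drifts right.
start, prop, virtual_obj = (0.0, 0.0), (0.0, 1.0), (0.3, 1.0)
print(body_warp((0.0, 0.0), start, prop, virtual_obj))  # no shift yet
print(body_warp((0.0, 1.0), start, prop, virtual_obj))  # full shift at the prop
```

Scaling the offset by reach progress (rather than applying it all at once) is what keeps the manipulation below the user's detection threshold.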

This technique is called Body Warping. But there are more approaches to hand redirection. Another approach keeps the one-to-one mapping between the virtual and the real hand but instead manipulates the user's orientation and position. By adding further or fewer degrees to a head rotation while the user is turning their head, we can change their orientation. Similarly, we can translate head movements upwards or downwards while the user moves their head or body. This results in a discrepancy between the real and the virtual world. For example, if we want a user to rotate 90 degrees in the virtual world but 180 degrees in the real world, we scale the user's head rotation down by 50%. This technique is called World Warping and is closely related to the idea of redirected walking.
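The rotation-gain example above can be sketched as follows. This is a hedged Python illustration of the math only (in the Unity toolkit, such a gain would be applied per frame to the tracked camera rig); the names are made up for this example:

```python
def world_warp_yaw(real_yaw_deltas_deg, rotation_gain):
    """Accumulate a virtual yaw by scaling each frame's real head rotation.
    With a gain of 0.5, turning 180 deg physically yields 90 deg virtually."""
    virtual_yaw = 0.0
    for delta in real_yaw_deltas_deg:
        virtual_yaw += rotation_gain * delta  # gain is injected while the head moves
    return virtual_yaw

# 18 frames of 10 degrees each = 180 degrees of real rotation, scaled down by 50%:
print(world_warp_yaw([10.0] * 18, 0.5))  # 90.0
```

Applying the gain incrementally during motion, rather than as a single jump, is what makes the manipulation hard to notice.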

If we now want to receive haptic feedback for virtual objects in a VR environment, we need physical counterparts in the real world - a concept known as passive haptic feedback or proxy-based haptics. But creating a one-to-one mapping between all virtual and physical objects hardly scales. Instead, it is often desirable to reuse one physical object (prop) for multiple virtual objects. This idea is called Haptic Retargeting. Hand redirection techniques can be used to implement it: when reaching for a virtual object represented by a prop in the real world, the user's hand is redirected such that the real hand touches the physical prop while the virtual hand touches the (potentially displaced) virtual counterpart.
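The core of haptic retargeting is recomputing the redirection offset whenever a reach towards a new virtual object begins, so that a single prop can serve several virtual objects. A minimal 2D sketch (the scenario, names, and positions are invented for illustration; this is not the toolkit's API):

```python
def retargeting_offset(virtual_object, physical_prop):
    """Full-reach warp: applied incrementally during the reach, it makes the
    real hand land on the prop while the virtual hand lands on the object."""
    return (virtual_object[0] - physical_prop[0],
            virtual_object[1] - physical_prop[1])

prop = (0.0, 1.0)                    # one physical mug on the table
mugs = {"left_mug": (-0.2, 1.0),     # two displaced virtual mugs
        "right_mug": (0.25, 1.0)}

# Recompute the warp at the start of each reach, depending on the chosen target:
offsets = {name: retargeting_offset(pos, prop) for name, pos in mugs.items()}
print(offsets)
```

Whichever virtual mug the user reaches for, the corresponding offset steers the real hand onto the same physical prop.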

Why do we need a toolkit for this?

Unity has become increasingly popular in Human-Computer Interaction (HCI) research in recent years. It is often used as a platform to create and run hand redirection studies. In addition, it supports nearly all Virtual Reality (VR) and Augmented Reality (AR) devices natively or via easily installed add-ons. For these reasons, Unity is our platform of choice.

Creating a new hand redirection approach does not only involve designing and implementing a new algorithm but also building a complete test environment for it. This is very time-consuming and often results in hardcoded scenarios that work well for one approach but do not extend easily to others.

With our hand redirection toolkit, we want to provide a modular system that allows users to create a new hand redirection approach and test it in a predefined test environment. Users only have to insert their algorithm into the given framework; they can then compare it with existing approaches on the fly, try it in a couple of example scenarios, or create their own scenarios.
For smaller changes, it is often tedious to set up the full VR hardware again, so our toolkit makes it possible to simulate all body and hand movements using only keyboard and mouse. This can help a lot when pre-testing scenarios.
Also, many existing redirection approaches build upon each other. A platform where the most common approaches from recent years are implemented side by side can help to improve existing approaches with new ideas or combinations.

The toolkit already includes a number of visualizations. These can come in quite handy when analyzing a new approach or tracking down bugs.

Who is this toolkit for?

Since hand redirection has not yet seen wide adoption and is so far mostly used in research, this toolkit mainly aims to help researchers develop and deploy hand redirection approaches. To use this toolkit, users should be familiar with Unity and VR development.

The Redirected Walking Toolkit

The idea to develop such a toolkit comes from the Redirected Walking Toolkit by Azmandian et al., who created a similar toolkit for redirected walking approaches in VR.
For more information, check out their toolkit and paper.