Abstract
In his wonderful essay from 1965, Ivan Sutherland described a future immersion into a computer-generated environment via novel types of multimodal input and output devices. He concluded with the following vision: "With appropriate programming such a display could literally be the Wonderland into which Alice walked." Unfortunately, even today real walking through virtual environments (VEs) is often not possible due to space constraints in the real world as well as technological limitations. However, redirected walking provides a solution to this problem by allowing users to walk through a large-scale immersive VE while physically remaining within a reasonably small workspace. To this end, manipulations are applied to the virtual camera motions so that the user's self-motion in the virtual world differs from the movements in the real world. Previous work has found that the human perceptual system tolerates a certain amount of inconsistency between proprioceptive, vestibular, and visual sensations in VEs, and even compensates for slight discrepancies with recalibrated motor commands. In this talk I will summarize previous work on redirected walking and present the results of several experiments which we performed to identify how much users can be tricked. Furthermore, we will see whether such manipulations impose cognitive demands on the user, which may compete with other tasks in VEs for finite cognitive resources.
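As a minimal illustration of the kind of camera manipulation the abstract refers to, the sketch below applies a rotation gain to the virtual camera: the user's real head turn is slightly scaled before it is rendered, so the virtual and physical rotations diverge. The function, parameter names, and gain value are illustrative assumptions, not details from the talk or the cited experiments.

```python
# Illustrative sketch (not from the talk): a rotation gain scales the user's
# real head rotation before it is applied to the virtual camera, so the
# virtual turn differs slightly from the physical turn.

ROTATION_GAIN = 1.1  # hypothetical value; gains near 1.0 tend to go unnoticed


def update_virtual_yaw(virtual_yaw_deg: float,
                       real_yaw_delta_deg: float,
                       gain: float = ROTATION_GAIN) -> float:
    """Add the scaled real rotation increment to the virtual camera yaw."""
    return virtual_yaw_deg + gain * real_yaw_delta_deg


# Example: a physical 90-degree turn is rendered as a 99-degree virtual turn,
# gradually steering the user's physical path back into the workspace.
print(update_virtual_yaw(0.0, 90.0))  # -> 99.0
```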
Prof. Dr. Frank Steinicke, Universität Hamburg
Location: INB seminar room