VicTour, Virtual Interactive Character Tour Guide

VicTour is YLabs's final proof-of-concept resulting from its latest software research project, CHAMELEON. During development, the project generated other examples already published, such as the “mother of all depth-sensing demos”, released long before the Kinect frenzy (http://www.youtube.com/watch?v=qXcIZ1…). This depth-sensing demo won the first Auggie award (the Augmented Reality Oscars) at the Augmented Reality Event 2010.
CHAMELEON emerged from the strong belief that the research and development of next-generation intelligent interaction devices will rely on integrative efforts across several research fields. The main objective of this project was therefore to explore and implement architectures and practical design methodologies for embodied intelligent interaction, with a focus on affective and cognitive computation models, as well as autonomous adaptation and learning of system components and parameters over dynamic multi-modal and multi-user environments. More specifically, research focused on:
• Situated cognition and human action models to support development of natural interaction systems;
• Embodied agent architectures suitable to the scalable integration of learning and affective computing algorithms;
• Unsupervised, reinforcement and evolutionary learning techniques for autonomous agents in the context of multi-modal natural interaction installations;
• Affective computing techniques for the design of emotion-based agents capable of interacting with users in a natural and believable way;
• Data mining techniques capable of analyzing and extracting high-level interaction trends and user features to assist in the self-adaptation of interactive systems.

VicTour was created to validate the CHAMELEON project's entire set of achievements. It inherited some of YVision's previously developed augmented reality functionalities, such as depth extraction with 2D cameras, collision detection, and dynamic occlusions, and materialized into an intelligent and empathetic virtual character that now lives in YDreams' showroom and is partly responsible for presenting it to our visitors.
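Dynamic occlusion of the kind mentioned above is commonly achieved with a per-pixel depth test: a virtual object's pixel is drawn only where it sits closer to the camera than the real scene as measured by the depth map. The sketch below is purely illustrative (not YVision's actual implementation); the array names and shapes are assumptions.

```python
import numpy as np

def composite_with_occlusion(color_frame, depth_map, virtual_rgb, virtual_depth):
    """Per-pixel depth test for dynamic occlusion.

    color_frame:   (H, W, 3) camera image
    depth_map:     (H, W) real-scene depth in metres (0 = no reading)
    virtual_rgb:   (H, W, 3) rendered virtual object
    virtual_depth: (H, W) virtual object's depth (np.inf where it has no pixel)
    """
    valid = depth_map > 0                           # ignore missing depth readings
    occluded = valid & (depth_map < virtual_depth)  # real scene is in front
    draw_virtual = (virtual_depth < np.inf) & ~occluded
    out = color_frame.copy()
    out[draw_virtual] = virtual_rgb[draw_virtual]   # virtual pixel wins elsewhere
    return out
```

Because the test runs per pixel on every frame, a real person walking in front of the virtual character naturally hides it, which is what makes the occlusion "dynamic".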
This application encompasses many current computing trends, such as affective computing, distributed computation, augmented reality, and ambient intelligence (smart surroundings), and opens up a large spectrum of commercial applications that may be explored in the near future.

YDreams’ Augmented Reality experience with depth-sensing camera

YDreams' augmented-reality platform supports depth-sensing cameras. These cameras make it possible to pinpoint the 3D positions of multiple users and of objects in the environment, allowing for a truly markerless augmented reality experience. The virtual objects added to the video stream react to the positions and movements of both the users and the real objects.
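Pinpointing a user's 3D position from a depth camera typically means back-projecting a depth pixel through a pinhole camera model; the virtual object can then react when a tracked point comes within some radius of it. The sketch below illustrates this idea only; the function names, the intrinsics values, and the proximity trigger are assumptions, not the platform's API.

```python
import math

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project depth pixel (u, v) to a 3D point in camera space
    using a pinhole model. fx, fy (focal lengths) and cx, cy (principal
    point) are the camera intrinsics, in pixels."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def is_near(point, virtual_pos, radius_m):
    """True when a tracked 3D point is within radius_m metres of a
    virtual object's position, e.g. to trigger a reaction."""
    return math.dist(point, virtual_pos) < radius_m
```

Running the same back-projection over every detected user yields per-user 3D positions without any fiducial markers, which is what makes the experience markerless.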

Unlike similar applications that use 2D cameras, this one is not sensitive to light changes, camera movements, or noisy backgrounds. This simple demo exemplifies the user experience.