Entry Date:
January 17, 2012

Playtime Computing


The Playtime Computing System is a technological platform that computationally models a blended reality interactive and collaborative media experience, treating the on-screen and real-world spaces as one continuous space. On-screen audio-visual media (e.g., the story world's virtual environments and characters) extend into the physical environment using digital projectors, robotics, real-time behavior capture, and tangible interfaces. Player behavior is tracked using 3D motion capture as well as other sensors such as cameras and audio inputs.
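As one illustrative sketch of how tracked player behavior might be routed to both the on-screen renderer and the projected real-world media, the Python snippet below uses hypothetical class names (PlayerState, BlendedRealityStage, ConsoleOutput); it is not the system's actual API, only a minimal model of a single update being broadcast to every output that renders the story world.

# Hypothetical sketch: one tracked-player update drives every story-world output.
# All names here are illustrative assumptions, not the real Playtime Computing API.
from dataclasses import dataclass, field

@dataclass
class PlayerState:
    position: tuple                                  # (x, y, z) from 3D motion capture
    gestures: list = field(default_factory=list)     # events derived from camera/audio sensors

class ConsoleOutput:
    """Stand-in for a screen renderer, wall projector, or robot controller."""
    def __init__(self, name):
        self.name = name

    def render(self, state: PlayerState):
        print(f"[{self.name}] player at {state.position}, gestures={state.gestures}")

class BlendedRealityStage:
    """Routes each player update to all outputs so screen and room stay in sync."""
    def __init__(self, outputs):
        self.outputs = outputs

    def update(self, state: PlayerState):
        for output in self.outputs:
            output.render(state)

if __name__ == "__main__":
    stage = BlendedRealityStage([ConsoleOutput("screen"), ConsoleOutput("projector")])
    stage.update(PlayerState(position=(1.2, 0.0, 3.4), gestures=["wave"]))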

Physical objects can be instrumented or tracked so that they serve as additional tangible interfaces, affecting the behavior of characters and objects both on screen and off screen. A digital paint interface is under development that allows players to add digital assets to the story world both on-screen and in the projected real-world space. These digital assets can be used to add interactivity, to author the world and its characters, or simply to embellish it aesthetically.
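A minimal sketch of how an instrumented object's events could be mapped onto character behavior is shown below; the event name "object_shaken", the event-bus class, and the character's reaction are all assumptions made for illustration, not the system's actual interface.

# Hypothetical sketch of a tangible-interface event bus.
# Event names and handler signatures are assumptions for illustration only.
class TangibleEventBus:
    def __init__(self):
        self.handlers = {}                       # event name -> list of callbacks

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event, **data):
        for handler in self.handlers.get(event, []):
            handler(**data)

class Character:
    def __init__(self, name):
        self.name = name

    def react_to_shake(self, object_id, intensity):
        # The same reaction would be rendered on screen and in the projected space.
        print(f"{self.name} jumps: object {object_id} shaken with intensity {intensity}")

if __name__ == "__main__":
    bus = TangibleEventBus()
    fox = Character("fox")
    bus.on("object_shaken", fox.react_to_shake)
    # A tracked physical object reports that the player shook it.
    bus.emit("object_shaken", object_id="cube-7", intensity=0.8)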

Characters in this system can appear to transition smoothly from the physical world to the virtual on-screen world through a physical enclosure that metaphorically acts as a portal between the virtual and the real. Any events or changes that happen to the physical character in the real world are carried over to the virtual world, and digital assets can likewise transition from the virtual world to the physical world. These blended reality characters can either be programmed to behave autonomously, or their behavior can be controlled by the players.
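One way to model the portal handoff is to capture the physical character's state when it enters the enclosure and use that same state to initialize its virtual counterpart. The sketch below assumes hypothetical names (CharacterState, portal_transition, a character called "Nibbles") and is only a simplified model of carrying state across the portal.

# Hypothetical sketch of the portal handoff: carrying character state
# from its physical (robot) embodiment to its virtual (on-screen) one.
from dataclasses import dataclass

@dataclass
class CharacterState:
    name: str
    mood: str            # e.g., shaped by how players treated the physical character
    accessories: list    # digital assets attached to the character

class PhysicalCharacter:
    def __init__(self, state: CharacterState):
        self.state = state

class VirtualCharacter:
    def __init__(self, state: CharacterState):
        self.state = state

    def enter_screen(self):
        print(f"{self.state.name} appears on screen, mood={self.state.mood}, "
              f"accessories={self.state.accessories}")

def portal_transition(physical: PhysicalCharacter) -> VirtualCharacter:
    """Everything that happened to the physical character carries over unchanged."""
    return VirtualCharacter(physical.state)

if __name__ == "__main__":
    robot = PhysicalCharacter(CharacterState("Nibbles", mood="happy", accessories=["hat"]))
    avatar = portal_transition(robot)
    avatar.enter_screen()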

The Playtime Computing System supports co-creation of the blended reality story world through Creation Station workbenches and experiences that occur in the blended reality play space. The Creation Stations can be either co-located with or remote from the blended reality play space. A live audio and video feed (or asynchronous image capture) from the blended reality play space is transmitted into the Creation Station workspace (via display or projection). Simultaneously, a camera captures and streams video, images, and audio from the Creation Station workspace into the blended reality play space. In this way, local and remote players interact with each other and with physical and digital assets (either asynchronously or in real time) at different spatial scales: room scale in the play space and desktop scale at the Creation Station. Players can also create and share object-based media and digital assets between the blended reality play space and the Creation Station workspace. Multiple Creation Stations can be connected to the blended reality play space. Furthermore, multiple blended reality play spaces can also be linked to create a “multi-chamber” play space that is geographically distributed.
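The linkage topology described above could be sketched as a small graph of nodes exchanging streams and shared assets, as in the Python snippet below. The Node class, the link/broadcast methods, and the node names are assumptions for illustration; the sketch only shows how multiple Creation Stations and play spaces might be wired together bidirectionally.

# Hypothetical sketch of the linkage topology: Creation Stations and play spaces
# exchanging A/V streams and shared digital assets. Names are illustrative only.
class Node:
    def __init__(self, name):
        self.name = name
        self.links = []

    def link(self, other):
        # Bidirectional link: A/V streams and digital assets flow both ways.
        self.links.append(other)
        other.links.append(self)

    def broadcast(self, asset):
        for peer in self.links:
            print(f"{self.name} -> {peer.name}: shared asset '{asset}'")

if __name__ == "__main__":
    play_space_a = Node("play-space-A")
    play_space_b = Node("play-space-B")        # a remote "chamber"
    station_1 = Node("creation-station-1")     # co-located desktop workbench
    station_2 = Node("creation-station-2")     # remote workbench

    station_1.link(play_space_a)
    station_2.link(play_space_a)
    play_space_a.link(play_space_b)            # multi-chamber play space

    station_1.broadcast("painted-tree")        # a newly authored digital asset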