Entry Date:
January 26, 2010

DoppelLab: Tools for Exploring and Harnessing Multimodal Sensor Network Data


Homes and offices are increasingly filled with sensor networks that answer specific queries and solve pre-determined problems, but no comprehensive visualization tools exist for fusing these disparate data to examine relationships across spaces and sensing modalities. DoppelLab is an immersive, cross-reality virtual environment that serves as an active repository of the multimodal sensor data produced by a building and its inhabitants. We transform architectural models into browsing environments for real-time sensor data visualizations, as well as open-ended platforms for building audiovisual applications atop those data; these applications in turn become sensor-driven interfaces for physical-world actuation and control. As a visuospatial repository designed to enable rapid parsing, visualization, sonification, and application development, DoppelLab proposes to organize these data by the space from which they originate, thereby providing a platform for making both broad and specific queries about the activities, systems, and relationships in a complex, sensor-rich environment.
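The spatial organization described above — indexing multimodal readings by the space they originate from, so that both broad and room-specific queries are possible — can be sketched as a minimal data structure. The names here (`Reading`, `SpatialRepository`) and their fields are hypothetical illustrations, not DoppelLab's actual implementation:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Reading:
    """One sensor sample, tagged with its space of origin."""
    room: str        # hypothetical spatial tag, e.g. "atrium"
    modality: str    # e.g. "temperature", "motion", "audio_level"
    value: float
    timestamp: float

class SpatialRepository:
    """Organizes multimodal readings by the room they came from."""

    def __init__(self):
        self._by_room = defaultdict(list)

    def add(self, reading: Reading) -> None:
        self._by_room[reading.room].append(reading)

    def query(self, room: str = None, modality: str = None) -> list:
        """Broad query (no filters) or specific query (by room and/or modality)."""
        rooms = [room] if room is not None else list(self._by_room)
        return [r for rm in rooms for r in self._by_room.get(rm, [])
                if modality is None or r.modality == modality]
```

A visualization client could call `query(room="atrium")` to drive the rendering for one space, or `query(modality="motion")` to overlay a single modality across the whole building model.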