Entry Date: March 17, 2005

GroupMedia: Socially-Intelligent Wearables

Principal Investigator: Alex 'Sandy' Pentland


GroupMedia is a set of tools and applications that bring social context awareness and quantitative social intelligence to pervasive cell phones and PDAs. We believe that by building quantitative models of human behavior and social interaction, we can devise next-generation social software for these devices. Our current research focuses on analyzing speech features, body language, and physiology to understand conversational interest, movie audience reactions, speed-dating outcomes, risk-taking in games, and distance-separated group dynamics. Such real-time models could increase the productivity of organizations and take mobile social networking to the next level.

Cell phones are expected to soon become the most popular consumer device on the planet. About half of the 800 million cell phones shipped in 2005 were more powerful than a first-generation Pentium computer. By quantifying the behavior of cell phone users, it now seems possible to predict whom a user got along with, what movie they enjoyed, how well they spoke, or even what product they might buy.
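To make the prediction idea concrete, here is a minimal hypothetical sketch: a toy classifier that maps a few made-up behavioral features (speaking fraction, vocal energy, head-nod rate) to a reported-enjoyment label. The features, data, and choice of logistic regression are illustrative assumptions, not the project's actual method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-conversation features: [speaking fraction,
    # mean vocal energy, head-nod rate]. Labels mark whether the
    # user later reported enjoying the interaction. All values
    # here are invented for illustration.
    X = np.array([
        [0.62, 0.8, 1.4],
        [0.55, 0.7, 1.1],
        [0.20, 0.3, 0.2],
        [0.15, 0.2, 0.1],
    ])
    y = np.array([1, 1, 0, 0])

    model = LogisticRegression().fit(X, y)
    # Estimated probability that a new conversation was enjoyed.
    print(model.predict_proba([[0.50, 0.6, 0.9]])[0, 1])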

This real-time information could be used for feedback and training, to customize experiences and interactions with machines, to take images or annotate conversations, or even to connect friends and colleagues with appropriate privacy restrictions.

The GroupMedia project evolved from work at the Wearable Computing Group, a.k.a. Borglab, driven by the need for more perceptual, socially aware applications for cell phones and PDAs. We measure speaking styles (speech feature processing), head-nodding and body motion (accelerometry), and physiology (galvanic skin response) to understand interest in conversations, the effectiveness of elevator pitches, movie audience reactions, speed-dating, focus groups, and group interaction dynamics.
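As a rough illustration of the speech side of such measurement, the sketch below computes short-time energy and a crude voice-activity proxy for how much someone talked. The 16 kHz sample rate, frame sizes, and threshold are assumptions chosen for the example, not parameters from the project.

    import numpy as np

    def short_time_energy(x, frame_len=400, hop=160):
        # Mean squared amplitude per 25 ms frame (10 ms hop at an
        # assumed 16 kHz sample rate): a basic speech-feature building block.
        starts = range(0, len(x) - frame_len + 1, hop)
        return np.array([np.mean(x[i:i + frame_len].astype(float) ** 2)
                         for i in starts])

    def speaking_fraction(energy, thresh=None):
        # Fraction of frames above an energy threshold: a crude
        # voice-activity proxy for how much someone talked.
        if thresh is None:
            thresh = 0.5 * energy.mean()  # naive adaptive threshold (illustrative)
        return float(np.mean(energy > thresh))

    # Example on synthetic audio: 1 s of noise with a louder 'speech' burst.
    rng = np.random.default_rng(0)
    audio = rng.normal(0.0, 0.05, 16000)
    audio[4000:12000] += rng.normal(0.0, 0.5, 8000)
    print(speaking_fraction(short_time_energy(audio)))

In a real deployment, features like this would be computed on-device and combined with accelerometer and galvanic-skin-response streams before any modeling.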