Personal Robots Group: Social Robots for Long-Term Interaction
Dr. Hae Won Park is a research scientist in the Personal Robots Group at MIT Media Lab. Hae Won develops interactive social machines that deeply personalize to their users through long-term interaction, attending to the unique needs and goals of each individual.
Hae Won Park has always been fascinated with robots. She cites well-known examples from our collective sci-fi media experiences—R2-D2, C-3PO, even Rosie from The Jetsons—as inspirations for her dreams of robot companions. But when she was an undergraduate student at POSTECH in Korea, her research was geared toward the sensing and mechanics of robots. In other words, how robots see, move, and balance. By focusing on computer vision, electronics, and locomotion control, Hae Won endowed humanoid robots with the ability to, among other things, walk alongside humans while adjusting their course to avoid obstacles.
Not long ago, robots that could enrich our lives through social intelligence were pure fantasy. According to Hae Won, the groundbreaking work of Personal Robots Group founder and Director Cynthia Breazeal was a key motivating factor in her own shift to the realm of social robotics and human-robot interaction. “I was inspired by Cynthia Breazeal’s pioneering research on social robotics, including the work that led to Kismet and Leonardo. The reason we build robots is to help people—we want to empower people and improve their lives. In order for robots to work closely with us and share our space, they require socio-emotive intelligence.”
Today, Hae Won works at the MIT Media Lab in collaboration with Professor Breazeal, leading research efforts in artificial intelligence for long-term personalization, embodied emotional intelligence, and applications in education, healthcare, and eldercare. The systems she develops are deployed and tested in the real world with everyday users, in areas of high societal impact that support learning, emotional and social well-being, behavior change, and decision making. “We’re deeply invested in real-world deployment at Personal Robots Group,” says Hae Won. “Industry comes to us for solutions because we’re always focused on designing technologies that provide the ultimate end-user experience.”
When we talk to others, our verbal and nonverbal cues play an essential role in everyday interactions. We signal engagement in a conversation in a variety of ways (e.g., eye contact, head movement, facial expressions, hand and body gestures). The same sentence, delivered with different sentiment or intonation, can carry a drastically different meaning. For Hae Won, analyzing and understanding these cues, and applying that knowledge within the context of an interaction, is one of the key factors driving the rapid growth of social robotics.
It also explains the importance of having a robot with a physical presence, as opposed to just a voice agent. “If there is no embodiment in front of you, it is less likely that you will use expressive facial cues or even express sentiment in your voice. In that regard, this is a very new and exciting type of technology that lets us investigate a new data domain that is all about how humans express nonverbal cues, understanding those cues, and how we use them to personalize robots to an individual over time,” says Hae Won.
Much of Hae Won’s recent work has been applied to the field of education. She points out that engagement is a key aspect of the learning process for children. Under her guidance, the robot learns to make inferences about a particular child’s mental state and uses that information to choose an appropriate action. “Over time, when children develop a peer-to-peer working alliance with a robot, they start outputting different cues,” she says. “We’re interested in how to ensure our robots adapt to those cues and, over time, from interaction to interaction, use that information to provide appropriate learning materials for each individual child.”
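As a concrete illustration, the sketch below shows one way nonverbal cues might be aggregated into an engagement estimate that then drives the choice of a tutoring action. This is a hypothetical example, not the group’s actual system: the cue set, the fixed weights, and the action names are all illustrative assumptions (a real system would learn these from annotated interaction data).

```python
from dataclasses import dataclass

@dataclass
class Cues:
    """Illustrative nonverbal signals, each normalized to [0, 1]."""
    eye_contact: float    # fraction of recent frames with mutual gaze
    head_movement: float  # nodding / orientation toward the task
    smile: float          # facial-expression positivity

def estimate_engagement(cues: Cues) -> float:
    """Combine cues into a single engagement score in [0, 1].

    The fixed weights below are purely hypothetical; a deployed system
    would learn them from labeled child-robot interaction data.
    """
    return 0.5 * cues.eye_contact + 0.3 * cues.head_movement + 0.2 * cues.smile

def choose_action(engagement: float) -> str:
    """Pick a tutoring action based on the inferred mental state."""
    if engagement < 0.3:
        return "re-engage"         # e.g., an expressive prompt or mini-game
    elif engagement < 0.7:
        return "encourage"         # verbal/nonverbal encouragement
    return "advance_curriculum"    # child is engaged; present new material

print(choose_action(estimate_engagement(Cues(0.8, 0.6, 0.4))))
```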
Tega is a furry, interactive desktop robot built to assist children with language learning and with building positive learning attitudes. Hae Won spearheaded Tega’s interaction intelligence design and field deployment. Since 2016, under Hae Won’s research supervision, Tega has interacted with over 250 children in Greater Boston, working as a co-learning peer and as a classroom assistant. Tega’s deployments have focused on areas with large bilingual and English-learner populations and on families of low socioeconomic status.
Tega has the ability to sense and respond to a child’s engagement state and learning progression, which function as reward signals that inform Tega’s actions and allow the robot to adapt to the child. In essence, Hae Won’s affective reinforcement learning algorithm guides the robot’s behavior to better hold each child’s attention and improve learning over time. The platform is bright and furry to appeal to children, runs on a battery that lasts up to six hours, and uses an Android-based smartphone as its display and main controller.
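The loop described above, in which engagement and learning signals act as rewards that shape the robot’s behavior, is the core of reinforcement learning. Below is a minimal tabular Q-learning sketch of that idea, assuming a discretized engagement state and a small, purely illustrative action set. It shows the general technique rather than Tega’s actual algorithm, reward weighting, or action vocabulary.

```python
import random
from collections import defaultdict

ACTIONS = ["encourage", "hint", "new_word", "celebrate"]  # illustrative action set
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration rate

# Q-table over (engagement_state, action); engagement discretized to low/med/high
Q = defaultdict(float)

def reward(engagement: float, learning_gain: float) -> float:
    """Affective reward blending in-the-moment engagement with learning progress.

    The 50/50 weighting is an assumption made for this illustration.
    """
    return 0.5 * engagement + 0.5 * learning_gain

def select_action(state: str) -> str:
    """Epsilon-greedy policy over the Q-table."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state: str, action: str, r: float, next_state: str) -> None:
    """Standard one-step Q-learning update."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])

# One illustrative interaction step
state = "low"                                   # child currently appears disengaged
action = select_action(state)
r = reward(engagement=0.7, learning_gain=0.4)   # sensed after the action is taken
update(state, action, r, next_state="high")
```

Over many such steps, the Q-values come to favor actions that reliably raise both engagement and learning gains for that particular child, which is the adaptive behavior the paragraph describes.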
The social robotics market is forecast to grow to more than half a billion dollars by 2023, driven largely by the demands of the aging-in-place market (i.e., older adults choosing to live in the residence of their preference), a population expected to reach 98 million people in the U.S. by 2060. Hae Won and her colleagues at Personal Robots Group are at the forefront of exploring eldercare applications for their social robot technologies.
“I’m particularly interested in agents that can support elder care, especially to support older adults to live independently as long as possible,” says Hae Won. But it’s not just the U.S. looking to social robotics to assist the elderly. Hae Won notes that Korea, Japan, and many European countries face similar challenges in caring for aging populations and regard social robotics as a promising assistive technology.
Hae Won Park wants to deploy robots on a long-term basis—that means robots at home, robots in schools and in hospitals, always on, interacting with us, enriching our lives on a daily basis. “I think personalization, adapting a robot to a user’s wants and needs over time, and also adapting from one interaction to the next while learning from prior interactions, these are the keys to unlocking long-term beneficial human-robot relationships.”