Human-Robot Commensality

Eating independently with friends and family is one of the most memorable and important activities for many people with mobility limitations. Robots can potentially help with this activity, but robot-assisted feeding is a multi-faceted problem with challenges in bite acquisition, bite timing, and bite transfer. Bite timing becomes uniquely challenging in social dining scenarios because the robot risks interrupting the group's social interaction during commensality. Our key insight is that bite timing strategies that account for the delicate balance of social cues can lead to seamless interactions during robot-assisted feeding in a social dining scenario. We approach this problem by collecting a multimodal Human-Human Commensality Dataset (HHCD) containing 30 groups of three people eating together. We use this dataset to analyze human-human commensality behaviors and to develop bite timing prediction models for social dining scenarios, and we transfer these models to human-robot commensality. Our user studies show that prediction improves when our algorithm uses multimodal social signaling cues between diners to model bite timing.
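The bite timing problem above can be framed as per-frame binary prediction ("feed now" vs. "wait") over fused multimodal cues. The sketch below illustrates that framing only: the feature names, weights, and logistic scoring are illustrative assumptions, not the actual HHCD models.

```python
import numpy as np

def extract_features(frame):
    """Pack per-frame social cues into a vector.
    All keys are hypothetical placeholders for HHCD-style annotations."""
    return np.array([
        frame["gaze_at_diner"],       # 1.0 if co-diners are looking at the diner
        frame["diner_speaking"],      # 1.0 if the diner is mid-utterance
        frame["others_chewing"],      # fraction of co-diners currently eating
        frame["time_since_last_bite"] / 60.0,  # elapsed time, normalized
    ])

def should_feed(frame, weights=np.array([0.2, -1.5, 0.8, 1.0]), bias=-0.5):
    """Logistic score over fused cues; feed when probability > 0.5.
    The negative weight on speech avoids interrupting an utterance."""
    z = weights @ extract_features(frame) + bias
    p = 1.0 / (1.0 + np.exp(-z))
    return p > 0.5

# Example: diner is silent, co-diners are eating, 90 s since the last bite
frame = {"gaze_at_diner": 0.0, "diner_speaking": 0.0,
         "others_chewing": 0.7, "time_since_last_bite": 90.0}
print(should_feed(frame))  # True: a socially unobtrusive moment to feed
```

In practice the weights would be learned from the HHCD annotations rather than hand-set, but the fusion-then-threshold structure is the same.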

Robot-assisted Feeding

Eating is an activity of daily living (ADL), and losing the ability to self-feed can be devastating. Robots have the potential to help with this task. Successful robot-assisted feeding depends on reliable bite acquisition, appropriate bite timing, and easy bite transfer in both individual and social dining settings. Automating bite acquisition is daunting: the universe of foods, cutlery, and human strategies is massive, and the activity demands robust nonprehensile manipulation of deformable, hard-to-model targets. Bite timing, especially in social dining settings, is a delicate dance of multimodal signaling (via gaze, facial expressions, gestures, and speech, among others), action, and sometimes coercion. Bite transfer constitutes a unique type of robot-human handover in which the human must use their mouth. Through this project, we are developing algorithms and technologies toward a robotic system that can autonomously feed people with upper-extremity mobility limitations in real homes.
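One way to think about the bite-transfer handover is as a guarded approach: the utensil moves toward the mouth and halts as soon as contact force exceeds a safety limit (e.g., the person has taken the bite). The sketch below is a minimal illustration of that idea; the step size, force limit, and `transfer_step` interface are all assumptions for exposition, not the project's controller.

```python
import numpy as np

def transfer_step(pos, mouth_pos, sensed_force, step=0.01, force_limit=2.0):
    """Advance the utensil one step toward the mouth; halt on contact force.

    pos, mouth_pos: 3D positions in meters (illustrative frame)
    sensed_force:   magnitude from a wrist force/torque sensor, in newtons
    """
    if sensed_force > force_limit:       # bite taken or unexpected contact
        return pos, "stop"
    direction = mouth_pos - pos
    dist = np.linalg.norm(direction)
    if dist < step:                      # close enough: snap to the target
        return mouth_pos, "arrived"
    return pos + step * direction / dist, "moving"

pos = np.array([0.0, 0.0, 0.0])
mouth = np.array([0.0, 0.0, 0.05])
pos, status = transfer_step(pos, mouth, sensed_force=0.1)
print(status)  # moving
```

A real system would also reason about mouth pose estimation and compliant retreat, but the guarded-motion loop captures why in-mouth handovers differ from ordinary object handovers.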

Compliant Manipulation with Whole-arm Sensing

Physical interactions between robots and humans are inevitable and desired during caregiving. Tactile sensing can enable a robot to infer properties of its surroundings from planned and incidental contact during manipulation in unstructured human environments. Through this project, we are developing solutions that combine multimodal sensing and perception into intelligent planning and control policies for safe and efficient manipulation with and around humans.
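A core subproblem in the paragraph above is separating planned contact (e.g., a grasp the policy intended) from incidental contact (e.g., brushing a person's arm). A minimal sketch, assuming a whole-arm skin read out as per-taxel pressures and a policy-provided mask of where contact is expected; the taxel layout, threshold, and labels are illustrative, not a specific sensor's API:

```python
import numpy as np

def classify_contact(taxel_pressures, expected_mask, threshold=0.5):
    """Label each taxel as no-contact, planned, or incidental.

    taxel_pressures: per-taxel pressure readings (arbitrary units)
    expected_mask:   True where the manipulation policy planned contact
    """
    active = taxel_pressures > threshold
    labels = np.full(taxel_pressures.shape, "none", dtype=object)
    labels[active & expected_mask] = "planned"     # contact the policy intended
    labels[active & ~expected_mask] = "incidental" # unplanned contact to react to
    return labels

pressures = np.array([0.1, 0.9, 0.7, 0.0])
expected = np.array([False, True, False, False])
print(classify_contact(pressures, expected))
# ['none' 'planned' 'incidental' 'none']
```

Downstream, a controller could treat "incidental" taxels as a signal to reduce stiffness or replan, while "planned" contact proceeds normally.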

Featured Videos