RCareWorld

We present RCareWorld, a human-centric simulation world for physical and social robotic caregiving designed with input from stakeholders such as care recipients, caregivers, occupational therapists, and roboticists. RCareWorld has realistic human models of care recipients with mobility limitations and of caregivers, home environments with multiple levels of accessibility and assistive devices, and robots commonly used for caregiving. It interfaces with various physics engines to model the diverse material types necessary for simulating caregiving scenarios, and it provides the capability to plan, control, and learn both human and robot control policies by integrating with state-of-the-art external planning and learning libraries and VR devices. We propose a set of realistic caregiving tasks in RCareWorld as a benchmark for physical robotic caregiving and provide baseline control policies for them. We illustrate the high-fidelity simulation capabilities of RCareWorld by 1) demonstrating the execution of a policy learned in simulation for one of these tasks on a real-world robot setup and 2) receiving positive feedback from clinical stakeholders on the realism of the modeled human avatars and assistive environments for caregiving activities. Additionally, we perform a real-world social robotic caregiving experiment using behaviors modeled in RCareWorld. Robotic caregiving, though it has the potential to enhance the quality of life of both care recipients and caregivers, remains a field with many barriers to entry because of its interdisciplinary nature. Through this project, we are taking the first step towards building a realistic simulation world for robotic caregiving (RCareWorld) that enables researchers worldwide to contribute to this impactful field.
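To make the "plan, control, and learn" capability concrete, the sketch below shows the kind of reset/step interface that external learning libraries typically expect from a simulation environment. It is a minimal, self-contained toy and not the actual RCareWorld/pyrcareworld API; the environment, observation size, and reward are illustrative assumptions only.

```python
import numpy as np

class ToyCaregivingEnv:
    """Hypothetical stand-in for a simulated caregiving benchmark task.

    This is NOT the RCareWorld API; it only illustrates the reset/step
    interface that external planning and learning libraries usually
    drive when training a robot control policy in simulation.
    """

    def __init__(self, horizon=100):
        self.horizon = horizon
        self.t = 0

    def reset(self):
        self.t = 0
        # Observation: e.g., robot joint angles plus a few human-pose features (assumed).
        return np.zeros(10, dtype=np.float32)

    def step(self, action):
        self.t += 1
        obs = np.random.randn(10).astype(np.float32)
        # Placeholder reward: penalize large commands as a proxy for gentle motion.
        reward = -float(np.linalg.norm(action))
        done = self.t >= self.horizon
        return obs, reward, done, {}

# Random-policy rollout, the way a learning library would step the environment.
env = ToyCaregivingEnv()
obs, done = env.reset(), False
while not done:
    action = np.random.uniform(-1.0, 1.0, size=7)  # 7-DoF arm command (assumed)
    obs, reward, done, info = env.step(action)
```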

SPARCS

We propose Structuring Physically Assistive Robotics for Caregiving with Stakeholders-in-the-loop (SPARCS) to address the challenges of designing physical robot caregiving systems. SPARCS is a framework for physical robot caregiving comprising (i) Building Blocks, models that define physical robot caregiving scenarios, (ii) Structured Workflows, hierarchical workflows that enable us to answer the Whats and Hows of physical robot caregiving, and (iii) SPARCS-Box, a web-based platform to facilitate dialogue between all stakeholders. We collect clinical data for six care recipients with varying disabilities and demonstrate the use of SPARCS in designing well-defined caregiving scenarios and identifying their care requirements. All of the data and workflows are available on SPARCS-Box. We demonstrate the utility of SPARCS by building a robot-assisted feeding system for one of the care recipients, and we perform experiments to show the adaptability of this system to different caregiving scenarios. Finally, we identify open challenges in physical robot caregiving by consulting care recipients and caregivers.
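As an illustration of what a "Building Block" style scenario definition could look like in code, here is a small sketch. The field names and example values are assumptions made for this example, not the actual SPARCS schema; the point is only that a caregiving scenario is specified by structured models of the care recipient, the caregiver, the environment, and the task.

```python
from dataclasses import dataclass, field

@dataclass
class CaregivingScenario:
    """Hypothetical sketch of a SPARCS-style scenario definition.

    Field names are illustrative assumptions, not the actual SPARCS
    Building Block schema.
    """
    care_recipient: dict                 # e.g., mobility limitations, range of motion
    caregiver: dict                      # e.g., formal vs. informal, availability
    environment: dict                    # e.g., home layout, assistive devices present
    task: str                            # e.g., "robot-assisted feeding"
    care_requirements: list = field(default_factory=list)

# Example instance (values are illustrative only).
scenario = CaregivingScenario(
    care_recipient={"upper_extremity_mobility": "limited", "head_motion": "partial"},
    caregiver={"type": "family member", "availability": "evenings"},
    environment={"setting": "home", "wheelchair_user": True},
    task="robot-assisted feeding",
    care_requirements=["soft foods only", "deliver bites at mouth height"],
)
```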

Human-Robot Commensality

Being able to eat independently with friends and family is considered one of the most memorable and important activities for people with mobility limitations. Robots can potentially help with this activity, but robot-assisted feeding is a multi-faceted problem with challenges in bite acquisition, bite timing, and bite transfer. Bite timing in particular becomes uniquely challenging in social dining scenarios, because offering a bite at the wrong moment can interrupt the human-robot group interaction during commensality. Through this project, we are developing bite timing strategies that take into account the delicate balance of social cues and can lead to seamless interactions during robot-assisted feeding in social dining scenarios.
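To make the bite-timing problem concrete, the sketch below gates a bite offer on a few multimodal social cues. The cue names, thresholds, and rule logic are assumptions for illustration only, not the project's actual timing model; a learned model over similar cues could replace the heuristic.

```python
from dataclasses import dataclass

@dataclass
class SocialCues:
    """Per-frame multimodal cues (names and sources are illustrative assumptions)."""
    user_is_talking: bool         # from audio activity detection
    partner_is_talking: bool
    user_gaze_on_food: bool       # from gaze estimation
    mouth_open: bool              # from facial keypoints
    time_since_last_bite: float   # seconds

def should_offer_bite(cues: SocialCues, min_gap: float = 20.0) -> bool:
    """Toy rule-based timing gate: offer a bite only when it is unlikely
    to interrupt the conversation and the user appears ready."""
    if cues.user_is_talking or cues.partner_is_talking:
        return False                       # don't interrupt ongoing speech
    if cues.time_since_last_bite < min_gap:
        return False                       # respect a natural pacing between bites
    return cues.user_gaze_on_food or cues.mouth_open

# Example: a quiet moment with the user looking at the plate triggers a bite offer.
print(should_offer_bite(SocialCues(False, False, True, False, 32.0)))  # True
```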


Robot-assisted Feeding

Eating is an activity of daily living (ADL), and losing the ability to self-feed can be devastating. Robots have the potential to help with this task. Successful robot-assisted feeding depends on reliable bite acquisition, appropriate bite timing, and easy bite transfer in both individual and social dining settings. Automating bite acquisition is daunting because the universe of foods, cutlery, and human strategies is massive, and the activity demands robust nonprehensile manipulation of deformable, hard-to-model targets. Bite timing, especially in social dining settings, is a delicate dance of multimodal signaling (via gaze, facial expressions, gestures, and speech, among others), action, and sometimes coercion. Bite transfer constitutes a unique type of robot-human handover in which the human must use their mouth. Through this project, we are developing algorithms and technologies that address these challenges, working towards a robotic system that can autonomously feed people with upper-extremity mobility limitations in real homes.
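As one way to picture how a bite-acquisition policy might be structured, the sketch below scores a small set of discrete utensil primitives against estimated food-item properties. The primitives, features, and scoring rules are all assumptions made for this example, not the project's actual method.

```python
# Hypothetical discrete acquisition primitives: (approach pitch in degrees, tine roll).
PRIMITIVES = {
    "vertical_skewer": (90.0, 0.0),
    "tilted_skewer":   (60.0, 0.0),
    "scoop":           (20.0, 90.0),
}

def score_primitive(name, food):
    """Toy score of a primitive against estimated food properties
    (hardness and slipperiness in [0, 1], assumed to come from perception)."""
    pitch, _ = PRIMITIVES[name]
    if name == "scoop":
        # Scooping suits soft, slippery items (e.g., banana slices).
        return (1.0 - food["hardness"]) + food["slipperiness"]
    # Skewering suits firmer items; steeper approach scores higher for hard items.
    return food["hardness"] + (pitch / 90.0) * food["hardness"]

def choose_primitive(food):
    return max(PRIMITIVES, key=lambda name: score_primitive(name, food))

print(choose_primitive({"hardness": 0.8, "slipperiness": 0.2}))  # firm item -> vertical_skewer
print(choose_primitive({"hardness": 0.1, "slipperiness": 0.9}))  # soft item -> scoop
```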


Compliant Manipulation with Whole-arm Sensing

Physical interactions between robots and humans are inevitable and often desired during caregiving. Tactile sensing can enable a robot to infer properties of its surroundings from both planned and incidental contact during manipulation in unstructured human environments. Through this project, we are developing approaches that efficiently combine multimodal sensing and perception to enable intelligent planning and control policies for safe and efficient manipulation with and around humans.
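To illustrate what compliant behavior in response to sensed contact can look like, here is a minimal sketch of a standard Cartesian admittance law. It is not the project's controller; the gains and the single-point force input are illustrative assumptions (a whole-arm sensing skin would supply many distributed contact signals instead).

```python
import numpy as np

def admittance_step(x_des, x, v, f_contact, dt, M=2.0, D=40.0, K=200.0):
    """One step of a standard Cartesian admittance law,
        M * a + D * v + K * (x - x_des) = f_contact,
    so that sensed contact forces deflect the commanded motion
    instead of being rigidly resisted. Gains are arbitrary examples.
    """
    a = (f_contact - D * v - K * (x - x_des)) / M
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Example: a constant 5 N push in +x while the robot holds a setpoint at the origin.
x_des = np.zeros(3)
x, v = np.zeros(3), np.zeros(3)
for _ in range(100):                       # simulate 1 s at 100 Hz
    f = np.array([5.0, 0.0, 0.0])          # sensed contact force
    x, v = admittance_step(x_des, x, v, f, dt=0.01)
print(x)  # settles near f / K = [0.025, 0, 0] m, i.e., the arm yields to the contact
```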


Featured Videos