Embodiment Meets Environment: Toward Context-Aware, Safe Physical Caregiving Robots

Cornell University

RSS 2026


Abstract

Physical caregiving robots must assist different users with different tasks in diverse environments, and the robots themselves come in many embodiments. While substantial progress has been made on individual caregiving tasks, most existing systems remain tightly coupled to specific environments and robot embodiments, and they often do not explicitly model or constrain interactions around people, even though humans are special agents in the environment. This motivates a focus on adapting to the context that emerges from the joint interaction between the environment and the robot's embodiment.

We propose E2CARE, a framework that enables context-aware adaptation by representing primitive caregiving skills as interaction templates whose execution is reshaped online. E2CARE represents the environment, the robot, and the human within a unified 3D dynamic scene graph that models these interaction contexts explicitly, and it synthesizes task-specific constraints that govern how each skill is executed. By enforcing these constraints at runtime, the same skill templates can be reused zero-shot, and safely, across diverse environments and robot embodiments. We evaluate E2CARE on four activities of daily living in hundreds of simulated household environments, including assistive home settings, and across diverse robot embodiments, and we validate it through user studies on two caregiving tasks with two robots in real-world environments. Results demonstrate consistent, successful adaptation across these environments and embodiments.




BibTeX

@inproceedings{wu2026embodiment,
  title     = {Embodiment Meets Environment: Toward Context-Aware, Safe Physical Caregiving Robots},
  author    = {Wu, Zhanxin and Tong, Ruofei and Fang, Jiaying and Bhattacharjee, Tapomayukh},
  booktitle = {Robotics: Science and Systems (RSS)},
  year      = {2026}
}