Cornell researchers used computer vision, machine learning, and multimodal sensing to develop a robotic feeding system that safely feeds people with severe mobility limitations, including those with spinal cord injury, cerebral palsy, and multiple sclerosis.
“Feeding individuals with severe mobility limitations with robots is difficult, as many cannot lean forward and require food to be placed directly inside their mouths,” said Tapomayukh “Tapo” Bhattacharjee, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science and the paper’s senior author. “The challenge intensifies when feeding individuals with additional complex medical conditions.”
A paper on the system, “Feel the Bite: Robot-Assisted Intra-Mouth Bite Transfer Using Robust Mouth Recognition and Physical Interaction-Aware Control,” was presented at the Human-Robot Interaction (HRI) conference, held March 11-14 in Boulder, Colorado. The paper received a Best Paper Honorable Mention, and the team’s demonstration of the robotic feeding system received the Best Demo Award.
Bhattacharjee, a leader in the field of assistive robotics, and his EmPRISE Lab have spent years teaching machines the complex process by which humans feed themselves. Identifying food on a plate, picking it up, and transferring it into the mouth of a care recipient is a formidable task for a machine.
“The last five centimeters, from the utensil to inside the mouth, are really challenging,” Bhattacharjee said.
Some care recipients have a very limited mouth opening, measuring less than 2 centimeters, while others experience involuntary muscle spasms that can occur unexpectedly, even when the utensil is already inside their mouth, Bhattacharjee said. Additionally, some can bite food only at specific spots in their mouth, which they indicate by pushing on the utensil with their tongue.
“Current technology looks at a person’s face only once and assumes they will remain still, which is often not the case and can be very limiting for the care recipient,” said Rajat Kumar Jenamani, lead author of the paper and a doctoral student in the field of computer science.
To address these challenges, the researchers equipped their robot with two essential features: real-time mouth tracking that adapts to the user’s movements, and a dynamic response mechanism that lets the robot sense the nature of physical interactions as they happen and react appropriately. This allows the system to distinguish among sudden jerks, intentional bites, and the user’s attempts to manipulate the utensil inside their mouth, the researchers said.
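The distinction among jerks, bites, and tongue guidance can be pictured as a classification over the utensil’s force readings. The sketch below is purely illustrative: the function name, inputs, and thresholds are assumptions for exposition, not the paper’s actual method or values.

```python
from enum import Enum, auto

class Interaction(Enum):
    """Possible user interactions with the in-mouth utensil (illustrative)."""
    NONE = auto()
    SUDDEN_JERK = auto()       # involuntary spasm: large, fast force spike
    INTENTIONAL_BITE = auto()  # sustained, mostly downward force on the utensil
    TONGUE_GUIDANCE = auto()   # gentle sideways push repositioning the utensil

def classify_interaction(force_n: float, force_rate_n_s: float,
                         lateral_ratio: float) -> Interaction:
    """Classify one reading from the utensil's force sensor.

    force_n        -- magnitude of the applied force (newtons)
    force_rate_n_s -- rate of change of that force (newtons per second)
    lateral_ratio  -- fraction of the force acting sideways, in [0, 1]

    All thresholds below are hypothetical placeholders.
    """
    if force_n < 0.5:
        return Interaction.NONE
    if force_rate_n_s > 30.0:   # abrupt spike: treat as a spasm and retract safely
        return Interaction.SUDDEN_JERK
    if lateral_ratio > 0.6:     # mostly sideways: user steering the utensil
        return Interaction.TONGUE_GUIDANCE
    return Interaction.INTENTIONAL_BITE
```

A controller built this way would map each class to a behavior, e.g. retracting on a detected spasm but holding steady during an intentional bite.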
The robotic system successfully fed 13 individuals with a variety of medical conditions in a user study across three locations: the EmPRISE Lab on the Cornell Ithaca campus, a medical center in New York City, and a care recipient’s home in Connecticut. Users rated the robot as safe and comfortable, the researchers said.
“This is one of the most extensive real-world evaluations of an autonomous robot-assisted feeding system targeting end users,” Bhattacharjee said.
The team’s robot is a multi-jointed arm that holds a custom-built utensil at its end, capable of sensing the forces applied to it. The mouth-tracking method, trained on thousands of images of participants with varying head poses and facial expressions, combines data from two cameras positioned above and below the utensil. This allows precise detection of the mouth and overcomes visual occlusion caused by the utensil itself, the researchers said. The physical interaction-aware response mechanism uses both visual and force sensing to recognize how the user is interacting with the robot, Jenamani said.
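One simple way to combine two camera views, either of which the utensil may occlude, is confidence-weighted fusion of each view’s mouth estimate. This is a minimal sketch under that assumption; the class and function names are hypothetical and do not come from the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MouthDetection:
    x: float        # estimated mouth center (meters, robot frame)
    y: float
    z: float
    confidence: float  # detector confidence in [0, 1]

def fuse_detections(above: Optional[MouthDetection],
                    below: Optional[MouthDetection]) -> Optional[MouthDetection]:
    """Confidence-weighted fusion of the above- and below-utensil camera views.

    Either view may be occluded by the utensil (detection is None); the
    other view then carries the estimate alone.
    """
    views = [d for d in (above, below) if d is not None and d.confidence > 0]
    if not views:
        return None  # mouth lost in both views
    total = sum(d.confidence for d in views)
    return MouthDetection(
        x=sum(d.x * d.confidence for d in views) / total,
        y=sum(d.y * d.confidence for d in views) / total,
        z=sum(d.z * d.confidence for d in views) / total,
        confidence=max(d.confidence for d in views),
    )
```

Running such a fusion step on every frame is what lets the tracker follow a moving mouth instead of relying on a single initial snapshot of the face.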
“We’re enabling individuals to control a 20-pound robot with just their tongue,” he said.
He cited the user studies as the most gratifying part of the project, noting the significant emotional impact the robot had on care recipients and their caregivers. In one session, the parents of a daughter with quadriplegia caused by a rare birth defect watched her successfully feed herself using the system.
“It was a really touching moment. Her father raised his hat in celebration and her mother was almost in tears,” Jenamani said.
Additional work is needed to investigate the long-term usefulness of the system, but the promising results highlight its potential to improve the level of independence and quality of life of care recipients, the researchers said.
“It’s really amazing and I’m very satisfied,” Bhattacharjee said.
Co-authors on the paper include Daniel Stabile, M.S. ’23; Ziang Liu, a doctoral student in computer science; Abrar Anwar of the University of Southern California; and Katherine Dimitropoulou of Columbia University.
This research was primarily funded by the National Science Foundation.