
Robotic arms directly connected to a partially paralyzed man’s brain allow him to feed himself

Overview: Using brain-machine interface technology and robotic arms, a paralyzed man was able to feed himself for the first time in 30 years.

Source: Frontiers

Two robotic arms – a fork in one hand, a knife in the other – flank a man seated at a table with a piece of cake on a plate. A computer voice announces each action: “Move fork to food” and “Retract knife.” Partially paralyzed, the man makes subtle movements with his right and left fists at certain prompts, such as “select cut location,” so that the machine slices off a bite-sized piece. Now, “Bringing food to mouth,” and another subtle gesture to align the fork with his mouth.

In less than 90 seconds, a person with very limited upper-body mobility, who hasn’t been able to use his fingers in about 30 years, has just fed himself dessert with his mind and some clever robotic hands.

A team led by researchers from the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, and the Department of Physical Medicine and Rehabilitation (PMR) at the Johns Hopkins School of Medicine has published a paper in the journal Frontiers in Neurorobotics describing this feat, accomplished using a brain-machine interface (BMI) and a pair of modular prosthetic limbs.

Also called brain-computer interfaces, BMI systems provide a direct communication link between the brain and a computer, which decodes and “translates” neural signals to perform various external functions, from moving a cursor on a screen to, now, enjoying a piece of cake. In this particular experiment, muscle movement signals from the brain helped control the robotic prosthetics.
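To make “decoding and translating” concrete, the sketch below shows, in hypothetical Python, how a simple linear decoder might map a window of recorded neural features (such as binned firing rates) to a low-dimensional movement command. The channel counts, calibration data and ridge-regression decoder are illustrative assumptions, not the study’s actual pipeline.

```python
import numpy as np

# Minimal sketch (hypothetical, not the study's decoder): a linear decoder
# that maps one bin of recorded neural features to a 2-DOF velocity command.
rng = np.random.default_rng(0)

n_channels = 96          # e.g., channels from one microelectrode array
n_dof = 2                # two degrees of freedom controlled per hand here

# Calibration data: firing-rate features and the intended movements they map to
# (synthetic stand-ins for illustration only).
X = rng.normal(size=(500, n_channels))        # firing rates per time bin
Y = X @ rng.normal(size=(n_channels, n_dof))  # stand-in "intended" velocities

# Fit decoder weights with ridge-regularized least squares.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Translate one bin of neural features into a velocity command."""
    return firing_rates @ W

command = decode(rng.normal(size=n_channels))
print("decoded velocity command:", command)
```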

A new approach

The study built on more than 15 years of research in neural science, robotics and software led by APL in collaboration with the Department of PMR, as part of the Revolutionizing Prosthetics program originally sponsored by the US Defense Advanced Research Projects Agency (DARPA). The new paper outlines an innovative shared control model that allows a human to maneuver a pair of robotic prostheses with minimal mental input.

“This shared control approach aims to harness the intrinsic capabilities of the brain-machine interface and the robotic system, creating a ‘best of both worlds’ environment in which the user can personalize the behavior of a smart prosthesis,” said Dr. Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development Department. Tenore, the paper’s senior author, focuses on neural interface and applied neuroscience research.

“While our results are preliminary, we are excited to give users with limited capabilities a real sense of control over increasingly intelligent assistive machines,” he added.

Helping people with disabilities

One of the key robotics advancements demonstrated in the paper is the combination of robotic autonomy with limited human input, with the machine doing most of the work while allowing the user to customize the robot’s behavior to their liking, said Dr. David Handelman, the paper’s lead author and a senior roboticist in the Intelligent Systems Division of APL’s Research and Exploratory Development Department.

“In order for robots to perform human-like tasks for people with reduced functionality, they require human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton,” he explained.

“Our goal is to make it easy for the user to control the few things that are most important for specific tasks.”
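As an illustration of that idea, here is a hedged, hypothetical sketch of shared control: the robot plans a full cutting motion on its own, and a single low-dimensional user command (such as one decoded from a subtle fist movement) adjusts only the parameter that matters most for the task. The CutPlan structure, gain value and function names are assumptions for illustration, not the paper’s controller.

```python
from dataclasses import dataclass

# Illustrative sketch (assumption, not the paper's controller): the robot plans
# the full motion autonomously, and a small user input nudges only the
# parameter that matters most for the task (here, where to cut).

@dataclass
class CutPlan:
    x: float      # cut location along the food item (autonomous default)
    depth: float  # knife depth (autonomous)
    angle: float  # knife angle, degrees (autonomous)

def shared_control(plan: CutPlan, user_offset: float, gain: float = 0.02) -> CutPlan:
    """Blend the autonomous plan with a single low-dimensional user command.

    The user supplies one value; the robot keeps responsibility for every
    other degree of freedom.
    """
    return CutPlan(x=plan.x + gain * user_offset, depth=plan.depth, angle=plan.angle)

autonomous_plan = CutPlan(x=0.50, depth=0.03, angle=90.0)
adjusted = shared_control(autonomous_plan, user_offset=+3.0)  # "move cut slightly right"
print(adjusted)
```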

Dr. Pablo Celnik, the project’s principal investigator in the Department of PMR, said: “The human-machine interaction demonstrated in this project indicates the potential capabilities that can be developed to help people with disabilities.”

Closing the loop

Although the DARPA program officially ended in August 2020, the team from APL and the Johns Hopkins School of Medicine continues to work with colleagues from other institutions to demonstrate and explore the technology’s potential.

The next iteration of the system could integrate previous research showing that delivering sensory stimulation to amputees allowed them not only to perceive their phantom limb, but also to use muscle movement signals from the brain to control a prosthesis.

The theory is that adding sensory feedback delivered directly to a person’s brain could help them perform certain tasks without the constant visual feedback required in the current experiment.

“This research is a great example of this philosophy, where we knew we had all the tools to demonstrate this complex bimanual activity of everyday life that non-disabled people take for granted,” Tenore said.

“Many challenges lie ahead, including improved task execution, both in terms of accuracy and timing, and closed-loop control without the constant need for visual feedback.”

Celnik added: “Future research will explore the boundaries of these interactions, even beyond the basic activities of daily living.”

About this neurotech and robotics research news

Author: Press Office
Source: Frontiers
Contact: Press Office – Frontiers
Image: The image is in the public domain

Original research: Open access.
“Shared control of bimanual robotic limbs with a brain-machine interface for self-feeding” by Francesco Tenore et al. Frontiers in Neurorobotics


Abstract

Shared control of bimanual robotic limbs with a brain-machine interface for self-feeding

Advances in intelligent robotic systems and brain-machine interfaces (BMI) have helped restore functionality and independence to individuals living with sensorimotor deficits; however, tasks requiring bimanual coordination and fine manipulation remain unresolved, given the technical complexity of controlling multiple degrees of freedom (DOF) across multiple limbs via a user input in a coordinated manner.

To address this challenge, we implemented a collaborative, shared control strategy to manipulate and coordinate two modular prosthetic limbs (MPL) to perform a bimanual self-feeding task.

A human participant with microelectrode arrays in sensorimotor brain regions gave commands to both MPLs to perform the self-feeding task, including bimanual cutting. Motor commands were decoded from bilateral neural signals to control up to two DOFs on each MPL simultaneously. The shared control strategy allowed the participant to assign their four-DOF control inputs, two per hand, to as many as 12 DOFs for specifying the position and orientation of the robot end effector.
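The following hypothetical Python sketch illustrates the kind of mapping the abstract describes: two low-dimensional inputs per hand are routed, depending on the task phase, onto different subsets of a 12-DOF bimanual pose (position and orientation for each end effector), while the remaining DOFs stay under autonomous control. The phase names, channel labels and gain are assumptions, not the published implementation.

```python
# Hedged illustration (assumption): routing two user inputs per hand onto
# different subsets of a 12-DOF bimanual pose (x, y, z, roll, pitch, yaw per
# end effector), depending on the current task phase.

POSE_DOFS = ["L.x", "L.y", "L.z", "L.roll", "L.pitch", "L.yaw",
             "R.x", "R.y", "R.z", "R.roll", "R.pitch", "R.yaw"]

# Each phase maps the four user inputs (two per hand) to the DOFs that matter then.
PHASE_MAPS = {
    "select_cut_location": {"L1": "L.x", "L2": "L.y", "R1": "R.x", "R2": "R.y"},
    "align_fork_to_mouth": {"R1": "R.pitch", "R2": "R.yaw", "L1": "L.z", "L2": "L.x"},
}

def apply_user_input(pose: dict, phase: str, user_inputs: dict, gain: float = 0.01) -> dict:
    """Add scaled user inputs to the phase-relevant DOFs; all others stay autonomous."""
    new_pose = dict(pose)
    for channel, value in user_inputs.items():
        dof = PHASE_MAPS[phase].get(channel)
        if dof is not None:
            new_pose[dof] += gain * value
    return new_pose

pose = {d: 0.0 for d in POSE_DOFS}
pose = apply_user_input(pose, "select_cut_location", {"L1": 2.0, "R1": -1.0})
print({k: v for k, v in pose.items() if v != 0.0})
```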

Using neurally driven shared control, the participant successfully and simultaneously controlled movements of both robotic limbs to cut and eat food in a complex bimanual self-feeding task.

This demonstration of bimanual robot system control through a BMI in conjunction with intelligent robotic behavior has major implications for restoring complex movement behaviors for people with sensorimotor disorders.
