Users can control a robotic arm with their thoughts thanks to decoded brain signals


Researchers have developed a brain-machine interface (BMI) that can decode neural signals from the brain during arm movement.

BMI is a device that converts nerve signals into commands to control a machine, such as a computer or a robotic limb. There are two main techniques to monitor neural signals in BMIs: electroencephalography (EEG) and electrocorticography (ECoG).

EEG is widely used because it is non-invasive, relatively inexpensive, safe, and easy to use. However, it has low spatial resolution and picks up many irrelevant neural signals, which makes it difficult to interpret an individual's intentions from EEG recordings alone.

ECoG is a minimally invasive technique that involves placing electrodes directly on the surface of the cerebral cortex, beneath the skull. Compared with EEG, ECoG can track neural signals with significantly greater spatial resolution and less background noise. However, the technique has several limitations.

ECoG is primarily used to identify the potential sources of epileptic seizures, meaning the electrodes are placed in different locations in different patients, which may not be the optimal areas of the brain for detecting sensory and movement signals. This, notes Professor Jaeseung Jeong, a brain scientist at KAIST, is why such brain signals have been difficult to decode.

To overcome these difficulties, Professor Jeong's team developed a new algorithm for decoding ECoG neural signals during arm movement. The system is based on a machine-learning technique for analysing and predicting neural signals called an echo-state network, combined with a mathematical probability model, the Gaussian distribution.
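The article names the components but not the implementation, so the following is only a minimal sketch of the echo-state-network idea in Python (NumPy), using synthetic stand-in data and hypothetical dimensions: a fixed random recurrent "reservoir" transforms the input time series, and only a linear readout is trained (here by ridge regression). The study's Gaussian probability model is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 input channels, a 200-unit reservoir.
n_in, n_res = 4, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the recurrent weights so the spectral radius is below 1,
# which keeps the reservoir dynamics stable (the "echo-state" property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(U):
    """Drive the fixed reservoir with inputs U and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Synthetic stand-ins for neural features and a 1-D movement target
# (a smoothed version of the first input channel).
T = 500
U = rng.standard_normal((T, n_in))
y = np.convolve(U[:, 0], np.ones(10) / 10, mode="same")

X = run_reservoir(U)
# Train only the linear readout, with ridge regularisation.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

Because the reservoir is fixed and only the readout is fitted, training reduces to one linear solve, which is what makes echo-state networks attractive for decoding long neural time series.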

While the participants performed a reach-and-grasp task, the researchers recorded ECoG signals from four individuals with epilepsy. Because the ECoG electrodes were placed according to the potential sources of each patient's seizures, only 22% to 44% of the electrodes were located in the brain regions responsible for movement.

During the movement task, participants were given visual cues, either a real tennis ball placed in front of them or a virtual-reality headset showing a clip of a human arm reaching forward in first-person perspective. They were then asked to reach forward, grasp the object, then return their hand and release it, while wearing motion sensors on their wrists and fingers. In a second task, they were instructed to imagine reaching forward without moving their arms.

During both real and imagined arm movements, the researchers recorded signals from the ECoG electrodes and tested whether the new system could decode the direction of movement from the neural signals. They found that the novel ECoG decoder successfully classified arm movements in 24 directions in three-dimensional space, in both the real and virtual tasks, and that the results were at least five times more accurate than chance.
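The article does not give the paper's exact accuracy figures, but the chance baseline can be worked out from the claim itself. Assuming 24 equally likely directions, a rough illustration:

```python
# Chance level for classifying among 24 equally likely directions,
# and the accuracy implied by "at least five times better than chance".
n_directions = 24
chance = 1 / n_directions      # ≈ 0.042, i.e. about 4.2 %
implied_accuracy = 5 * chance  # ≈ 0.208, i.e. about 20.8 %
print(f"chance: {chance:.1%}, five times chance: {implied_accuracy:.1%}")
```

So "five times more accurate than chance" implies a decoding accuracy of at least roughly one in five attempts, a substantial margin for a 24-way classification problem.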

Overall, the findings suggest that the new machine-learning-based BMI system successfully used ECoG signals to identify intended movements. The next steps will be to improve the accuracy and efficiency of the decoder. In the future, it may be used in a real-time BMI device to assist people with movement or sensory impairments.
