A new paper published in IEEE Systems, Man, and Cybernetics Magazine describes the results of several experiments with a set of state-of-the-art machine learning algorithms for detecting and monitoring the mental workload and affective states of the human brain.
In brain-computer interfaces, or BCIs – classified as either active (used to control devices by brain activity alone) or passive (used to monitor a user's mental state or emotions) – brain signals are typically measured by electroencephalography (EEG).
The problem, however, is that raw EEG signals are difficult to organise into specific, meaningful patterns, and currently available systems do not have sufficiently advanced digital processing algorithms to make passive BCIs functional.
“The low accuracy is due to extremely high complexity of a human brain. The brain is like a huge orchestra with thousands of musical instruments from which we wish to extract specific sounds of each individual instrument using a limited number of microphones or other sensors,” said co-author Andrzej Cichocki.
In the study, Cichocki and colleagues examined two families of machine learning algorithms, Riemannian geometry-based classifiers (RGCs) and convolutional neural networks (CNNs), both of which have previously proved quite effective in active BCIs. In total, the researchers experimented with seven algorithms, two of which they had designed themselves.
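To give a sense of how a Riemannian geometry-based classifier works, here is a minimal sketch, not the paper's actual algorithms: each EEG trial is summarized by its spatial covariance matrix (a symmetric positive-definite matrix), the matrix is mapped into a flat tangent space via the matrix logarithm, and the resulting vectors are classified with a simple nearest-class-mean rule. The data is synthetic, and real pipelines typically take the log-map at the Riemannian mean of the training covariances rather than at the identity as done here.

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(0)

def covariance(trial):
    """Spatial covariance of one EEG trial (channels x samples)."""
    return trial @ trial.T / trial.shape[1]

def tangent_vector(cov):
    """Map an SPD covariance matrix to a Euclidean vector via the
    matrix logarithm (log-map at the identity), keeping the upper
    triangle since the result is symmetric."""
    log_cov = np.real(logm(cov))
    return log_cov[np.triu_indices_from(log_cov)]

def make_trial(label, n_channels=4, n_samples=256):
    """Synthetic 2-class EEG: class 1 has higher variance on channel 0."""
    scale = np.ones(n_channels)
    scale[0] = 3.0 if label == 1 else 1.0
    return rng.normal(size=(n_channels, n_samples)) * scale[:, None]

train = [(make_trial(y), y) for y in [0, 1] * 20]
test = [(make_trial(y), y) for y in [0, 1] * 10]

# Nearest-class-mean classifier in tangent space.
feats = {0: [], 1: []}
for trial, y in train:
    feats[y].append(tangent_vector(covariance(trial)))
means = {y: np.mean(v, axis=0) for y, v in feats.items()}

def predict(trial):
    v = tangent_vector(covariance(trial))
    return min(means, key=lambda y: np.linalg.norm(v - means[y]))

acc = np.mean([predict(t) == y for t, y in test])
```

On this easy synthetic problem the classifier separates the classes almost perfectly; the appeal of the approach in BCI is that covariance matrices capture inter-channel structure while the tangent-space mapping lets ordinary Euclidean classifiers be applied.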
The first experiment trained the algorithms on the EEG data of a specific individual and later tested them on that same individual. The second experiment was subject-independent: the algorithms were trained on data from some subjects and tested on others, a significantly more challenging task because brain-wave patterns vary considerably from person to person.
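The two evaluation protocols can be sketched as data splits. The subject labels and hold-out scheme below are hypothetical illustrations, not the paper's actual setup:

```python
import numpy as np

# Hypothetical labels: which subject produced each EEG trial.
subjects = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])

def within_subject_splits(subjects):
    """Subject-dependent protocol: train and test on trials from the
    same person (here, hold out each subject's last trial)."""
    for s in np.unique(subjects):
        idx = np.where(subjects == s)[0]
        yield idx[:-1], idx[-1:]

def leave_one_subject_out(subjects):
    """Subject-independent protocol: train on all other subjects and
    test on the held-out subject's trials."""
    for s in np.unique(subjects):
        yield np.where(subjects != s)[0], np.where(subjects == s)[0]
```

In the leave-one-subject-out case the classifier never sees the test subject's data, which is why individual variation in brain signals makes this setting so much harder.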
Results showed that the deep neural networks were more effective at workload estimation but underperformed at classifying emotions. In contrast, the two Riemannian algorithms – modified by the researchers for the study – did quite well in both tasks.
According to the authors, these findings indicate that passive BCIs are useful for workload estimation but less capable of detecting and monitoring affective states. They also note that more research is needed to improve subject-independent calibration, which currently yields fairly low accuracy.
“In the next steps, we plan to use more sophisticated artificial intelligence (AI) methods, especially deep learning, which allow us to detect very tiny changes in brain signals or brain patterns. Deep neural networks can be trained on the basis of a large set of data for many subjects in different scenarios and under different conditions. AI is a real revolution and is also potentially useful for BCI and recognition of human emotions,” Cichocki said.