In an article recently published in the journal Scientific Reports, researchers demonstrated the feasibility of brain-computer interface (BCI)-based robot control using eye artifacts, detected with a novel thresholding-based pattern recognition algorithm, for people with disabilities.
Background
Robots are increasingly playing a crucial role in patient care, particularly for individuals with disabilities. Individuals with neurodegenerative disorders often cannot voluntarily or consciously produce movements other than those involving the eyelids or eyes.
Thus, people with such disorders require new human-robot interaction methods, such as BCI, as exoskeletons and/or prostheses are ineffective for them. BCI is primarily a non-muscular communication channel that allows an individual to send messages and commands to an automated system such as a robot through their brain activity.
Thus, BCI technology is crucial for individuals with disabilities, as it provides an effective way to communicate with assistive technologies. Eye blinks and movements are usually treated as artifacts in electroencephalogram (EEG)-based BCI research, and only a few studies have exploited them for the control of and communication with machines.
Eye artifact-related research has primarily focused on blink detection, with EEG-based studies typically requiring participants to keep their eyes still. Studies focusing on the detection of eye movements are rare and mostly use electrooculogram (EOG) sensors in place of EEG.
Additionally, eye artifacts, unlike the useful EEG signals, are observable in the time domain and possess a higher signal-to-noise ratio (SNR). The major disadvantages of EOG are that the electrodes positioned around the eyes can adversely affect the individual's eyesight and provide no additional information compared with an EEG cap.
The proposed approach
In this study, researchers proposed a real-time BCI for controlling an assistive robot with the user's eye artifacts in a human-robot collaborative scenario, aiming to improve the quality of life of individuals with disabilities by enabling them to interact freely with their environment. Eye artifacts that contaminate EEG signals were used as a crucial source of information because they are generated intentionally and have a high SNR.
The researchers developed a novel thresholding-based pattern recognition algorithm to detect eye artifacts from the characteristic shapes they produce in the EEG signals. The F7, F8, and FP1 channels were used in the detection algorithm: FP1 for blink detection, and F7 and F8 for lateral movement detection.
Lateral (left and right) eye movements were detected from their ordered peak-and-valley pattern and the phase difference, that is, the opposite behavior of the F7 and F8 channels.
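The article does not reproduce the authors' implementation, but the idea can be illustrated with a minimal sketch: if F7 and F8 show strong deflections of opposite sign within the same analysis window, the window is classified as a lateral movement. The threshold value and the polarity-to-direction mapping below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def detect_lateral_movement(f7, f8, thresh=40e-6):
    """Illustrative sketch: classify a short EEG window as a lateral eye
    movement from opposite-polarity deflections of F7 and F8.
    `thresh` (in volts) is an assumed placeholder, not the paper's value."""
    f7, f8 = np.asarray(f7), np.asarray(f8)
    # Strongest deflection in each channel within the window
    p7 = f7[np.argmax(np.abs(f7))]
    p8 = f8[np.argmax(np.abs(f8))]
    # Both deflections must exceed the amplitude threshold
    if abs(p7) < thresh or abs(p8) < thresh:
        return None
    # Opposite signs (the "phase difference" between F7 and F8) indicate a
    # lateral movement; which channel deflects positive decides the direction.
    # The polarity-to-direction mapping here is an assumption for illustration.
    if p7 > 0 and p8 < 0:
        return "look_left"
    if p7 < 0 and p8 > 0:
        return "look_right"
    return None
```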
Blinks, in turn, were detected with a double-thresholding method designed to capture both regular and weak blinks, unlike other algorithms in the literature that rely on a single threshold. Events detected in real time, together with their virtual time stamps, were then fed into a second algorithm, a distributor, which distinguishes double and quadruple blinks from single blinks based on the frequency of blink occurrences.
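Again as a sketch under assumed parameters rather than the authors' implementation: a blink candidate might be accepted when its peak exceeds a high threshold, or when it exceeds only a lower threshold but still shows a blink-like rise; the resulting time stamps can then be grouped by their spacing to separate single, double, and quadruple blinks.

```python
import numpy as np

def detect_blinks(fp1, fs, hi=100e-6, lo=60e-6, refractory=0.3):
    """Illustrative double-threshold blink detector on the FP1 channel.
    hi/lo (volts) and refractory (s) are assumed values, not the paper's."""
    blink_times, last_t = [], -np.inf
    for i in range(1, len(fp1) - 1):
        t = i / fs
        # Skip samples inside the refractory period of the previous blink
        if t - last_t < refractory:
            continue
        # Local maximum of the frontal signal (candidate blink peak)
        if not (fp1[i] >= fp1[i - 1] and fp1[i] >= fp1[i + 1]):
            continue
        if fp1[i] >= hi:
            # Regular blink: passes the high threshold outright
            blink_times.append(t)
            last_t = t
        elif fp1[i] >= lo:
            # Weak blink: passes only the lower threshold; a steep rise is
            # required here as a crude stand-in for the paper's
            # characteristic-shape check (assumed heuristic).
            rise = fp1[i] - fp1[max(0, i - int(0.05 * fs))]
            if rise > lo / 2:
                blink_times.append(t)
                last_t = t
    return blink_times

def group_blinks(blink_times, window=1.0):
    """Illustrative 'distributor': group blink time stamps that fall within
    `window` seconds of each other to separate single, double, and quadruple
    blinks. The grouping window is an assumption."""
    events, cluster = [], []
    for t in blink_times:
        if cluster and t - cluster[-1] > window:
            events.append(("blink", len(cluster)))
            cluster = []
        cluster.append(t)
    if cluster:
        events.append(("blink", len(cluster)))
    return events  # e.g. [("blink", 1), ("blink", 2), ("blink", 4)]
```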
The TMSi SAGA 64+ EEG amplifier was used to record signals from the user's prefrontal cortex. The detection algorithm identified voluntary or forced blinks and lateral eye movements from the frontal channels, and this information was then used to control, through a graphical user interface offering several functions, the TIAGo assistive robot.
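The article does not describe the interface's command set; the hypothetical dispatch table below merely illustrates how classified eye events of the kind produced above could be translated into high-level interface commands.

```python
# Hypothetical mapping from detected eye events to interface actions;
# the actual command set of the authors' GUI is not specified in the article.
EVENT_TO_COMMAND = {
    "look_left": "move_cursor_left",
    "look_right": "move_cursor_right",
    ("blink", 2): "select_highlighted_option",
    ("blink", 4): "return_to_main_menu",
}

def dispatch(event):
    """Translate a detected eye event into a GUI/robot command string."""
    return EVENT_TO_COMMAND.get(event, "no_op")
```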
Experimental evaluation and findings
A 64-bit Windows laptop was used to record the EEG data for offline analysis, and a gel-type EEG cap was used in all experiments, with electrode impedance values maintained under 5 kΩ.
The algorithm was initially evaluated through real-time and offline analyses and then implemented in the BCI. Subsequently, the resulting BCI, with online detection of blinks and eye movement direction, was used to control a real assistive robot, TIAGo, through a graphical user interface. Five human subjects, three males and two females aged 27 ± 3 years, participated in experiments controlling the robot platform to validate the developed BCI.
The proposed algorithm was first evaluated offline to assess the performance of look left, look right, and blink detection. Six datasets, recorded on different dates through a series of experimental tests on the same subject, were analyzed to determine the offline detection performance.
The algorithm effectively detected eye artifacts, including blinks and lateral movements. The highest and lowest offline detection accuracies were 100% and 78.8% for blinks, 100% and 84.6% for look left, and 98% and 77.8% for look right, respectively.
In the real-time tests, participants performed eye artifacts of their choice, and real-time detection accuracy was compared with offline detection accuracy. The two differed: real-time accuracy was slightly lower than offline accuracy for blink detection and significantly lower for look left and look right detection.
However, the performance of the proposed detection algorithm was satisfactory in both offline and real-time analyses for implementation in BCI. Eventually, the detection algorithm was implemented to communicate with the TIAGo assistive robot through a graphical user interface to realize successful robot control and validate the BCI.
In the BCI validation experiment, all participants used their eye artifacts to control the base movements and to select a pre-defined task for the robot to perform, indicating the effective performance of the proposed BCI robot control with eye artifacts. The participants successfully tested all functions of the robot, validating the feasibility of the entire BCI system.
Journal reference:
- Karas, K., Pozzi, L., Pedrocchi, A., Braghin, F., & Roveda, L. (2023). Brain-computer interface for robot control with eye artifacts for assistive applications. Scientific Reports, 13(1), 1-16. https://doi.org/10.1038/s41598-023-44645-y, https://www.nature.com/articles/s41598-023-44645-y