Robot Control with Eye Artifacts Using Brain‑Computer Interface for Assistive Applications

In an article recently published in the journal Scientific Reports, researchers demonstrated the feasibility of brain-computer interface (BCI) robot control for people with disabilities, using eye artifacts detected by a novel thresholding-based pattern recognition algorithm.

Study: Robot Control with Eye Artifacts Using Brain Computer Interface for Assistive Applications. Image credit: pogonici/Shutterstock

Background

Robots are increasingly playing a crucial role in patient care, particularly for individuals with disabilities. Individuals with neurodegenerative disorders often cannot voluntarily or consciously produce any movements other than those involving the eyes or eyelids.

Thus, people with such disorders require new human-robot interaction methods, such as BCIs, as exoskeletons and prostheses are ineffective for them. A BCI is primarily a non-muscular communication channel that allows an individual to send messages and commands to an automated system, such as a robot, through their brain activity.

BCI technology is therefore crucial for individuals with disabilities, as it provides an effective way to communicate with assistive technologies. Eye blinks and eye movements are usually regarded as contamination in electroencephalogram (EEG)-based BCI research, and only a few studies have exploited these eye artifacts for the control of, and communication with, machines.

Eye artifact-related research has primarily focused on blink detection, with EEG-based studies typically ensuring that participants do not move their eyes. Studies focusing on eye-movement detection are rare and mostly use electrooculogram (EOG) sensors instead of EEG.

Additionally, unlike most useful EEG signals, eye artifacts are directly observable in the time domain and possess a higher signal-to-noise ratio (SNR). The major disadvantages of EOG are that the electrodes positioned around the eyes can adversely affect an individual's eyesight and that they provide no additional information compared to an EEG cap.

The proposed approach

In this study, researchers proposed a real-time BCI that lets a user control an assistive robot with their eye artifacts in a human-robot collaborative scenario, with the aim of improving the quality of life of individuals with disabilities by enabling them to interact freely with their environment. The eye artifacts that contaminate EEG signals were used as a crucial source of information because they are generated intentionally and have a high SNR.

The researchers developed a novel thresholding-based pattern recognition algorithm to detect eye artifacts through the characteristic shapes they produce in the EEG signals. The detection algorithm used the F7, F8, and FP1 channels: FP1 for blink detection, and F7 and F8 for lateral movement detection.

Lateral (left and right) movements were detected through their ordered peak-and-valley pattern and the phase difference, that is, the opposite behavior, of the F7 and F8 channels. A double-thresholding method was used for blink detection to capture both regular and weak blinks, as sketched below.
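
To make the detection logic concrete, the following is a minimal sketch of a double-threshold blink detector and an opposite-phase lateral-movement detector operating on short signal windows. The threshold values, window handling, and function names are illustrative assumptions, not the parameters published in the study.

```python
import numpy as np

# Illustrative thresholds in microvolts; the study's actual values are not
# quoted here, so these numbers are assumptions for the sketch.
BLINK_HIGH = 80.0    # upper threshold: catches regular (strong) blinks
BLINK_LOW = 40.0     # lower threshold: additionally catches weak blinks
LATERAL_TH = 60.0    # deflection threshold for lateral eye movements

def detect_blink(fp1_window):
    """Double-threshold blink detection on a window of the FP1 channel."""
    peak = np.max(fp1_window)
    if peak >= BLINK_HIGH:
        return "blink"        # a regular blink crosses the high threshold
    if peak >= BLINK_LOW:
        return "weak_blink"   # a weak blink only crosses the low threshold
    return None

def detect_lateral(f7_window, f8_window):
    """Detect look-left/look-right from the opposite behavior of F7 and F8."""
    f7 = f7_window[np.argmax(np.abs(f7_window))]  # dominant F7 deflection
    f8 = f8_window[np.argmax(np.abs(f8_window))]  # dominant F8 deflection
    # A lateral movement produces large, opposite-signed deflections on the
    # two channels; which sign maps to which direction depends on the montage,
    # so the assignment below is an assumption.
    if abs(f7) > LATERAL_TH and abs(f8) > LATERAL_TH and np.sign(f7) != np.sign(f8):
        return "look_left" if f7 > 0 else "look_right"
    return None
```

In practice, the windows would be consecutive short segments of the streaming EEG, and the ordered peak-and-valley pattern described in the paper would impose additional shape checks beyond the simple sign test shown here.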

This double-thresholding method differs from other algorithms in the literature, which use only a single threshold. Moreover, the events detected in real time, together with their virtual time stamps, were fed into a second algorithm, a distributor, which distinguishes double and quadruple blinks from single blinks based on their occurrence frequency, as in the sketch below.
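
A minimal sketch of such a distributor follows, assuming that blinks arriving close together in time form one multi-blink event; the gap parameter and function names are hypothetical, not the study's published timing values.

```python
def classify_burst(burst):
    """Map a burst of consecutive blinks to an event class."""
    if len(burst) >= 4:
        return "quadruple_blink"
    if len(burst) >= 2:
        return "double_blink"
    return "single_blink"

def distribute(blink_times, max_gap=0.5):
    """Group time-stamped blinks into single/double/quadruple events.

    Blinks separated by less than `max_gap` seconds are treated as one
    burst; the 0.5 s gap is an illustrative assumption.
    """
    events, burst = [], []
    for t in blink_times:
        if burst and t - burst[-1] > max_gap:
            events.append(classify_burst(burst))
            burst = []
        burst.append(t)
    if burst:
        events.append(classify_burst(burst))
    return events

# Example: two quick blinks followed by an isolated one.
print(distribute([0.10, 0.42, 2.0]))  # ['double_blink', 'single_blink']
```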

A TMSi SAGA 64+ EEG amplifier was employed to record signals over the user's prefrontal cortex. The detection algorithm identified voluntary and forced blinks and lateral eye movements from the frontal-cortex signals. This information was then used to operate a graphical user interface offering several functions, including control of the TIAGo assistive robot.
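
One plausible way to wire the detected events to the interface is a simple dispatch table, sketched below. The command names and the `robot_interface.send` call are hypothetical placeholders standing in for the GUI functions and robot interface described in the paper.

```python
# Hypothetical mapping from detected eye events to interface actions; the
# actual command set and GUI functions are those defined in the study.
COMMANDS = {
    "look_left": "move_base_left",
    "look_right": "move_base_right",
    "double_blink": "confirm_selection",
    "quadruple_blink": "emergency_stop",
}

def dispatch(event, robot_interface):
    """Forward a detected eye event to the robot through the GUI layer.

    Single blinks are deliberately left unmapped in this sketch, since
    they also occur involuntarily and would trigger spurious commands.
    """
    command = COMMANDS.get(event)
    if command is not None:
        robot_interface.send(command)  # e.g., a ROS topic/service call on TIAGo
```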

Experimental evaluation and findings

A 64-bit Windows laptop was employed to record the EEG data for offline analysis, and a gel-type EEG cap was used in all experiments, with electrode impedance values maintained under 5 kΩ.

The algorithm was initially evaluated through real-time and offline analyses and then implemented in the BCI. Subsequently, the resulting BCI, with online detection of blinks and eye-movement direction, was used to control the real assistive robot TIAGo through a graphical user interface. Five human subjects, three males and two females aged 27 ± 3 years, participated in robot-control experiments to validate the developed BCI.

The proposed algorithm was evaluated offline to assess the performance of look-left, look-right, and blink detection. Six datasets, recorded on different dates through a series of experimental tests performed on the same subject, were analyzed to determine the offline detection performance.

The algorithm effectively detected eye artifacts, including blinks and lateral movements. Across the datasets, the highest and lowest offline detection accuracies were 100% and 78.8% for blinks, 100% and 84.6% for look-left movements, and 98% and 77.8% for look-right movements, respectively.
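
For context, detection accuracy in such evaluations is typically tallied as the fraction of true events that are correctly detected; this interpretation is an assumption here, as the paper's exact formula is not quoted.

```python
def detection_accuracy(n_correct, n_true):
    """Detection accuracy: correctly detected events over true events, in %."""
    return 100.0 * n_correct / n_true

# Illustrative arithmetic only: 26 of 33 blinks detected -> ~78.8%.
print(f"{detection_accuracy(26, 33):.1f}%")  # 78.8%
```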

In the real-time tests, participants were free to perform eye artifacts of their choice, and real-time detection accuracy was compared against offline detection accuracy. The two differed: real-time accuracy was slightly lower than offline accuracy for blink detection and significantly lower for look-left and look-right detection.

However, the performance of the proposed detection algorithm in both offline and real-time analyses was satisfactory for implementation in a BCI. The detection algorithm was ultimately used to communicate with the TIAGo assistive robot through a graphical user interface, realizing successful robot control and validating the BCI.

In the BCI validation experiment, all participants used their eye artifacts to control the robot's base movements and to select a pre-defined task for the robot to perform, demonstrating the effectiveness of the proposed eye-artifact-based BCI robot control. The participants successfully tested all functions of the robot, validating the feasibility of the entire BCI system.


Written by

Samudrapom Dam

Samudrapom Dam is a freelance scientific and business writer based in Kolkata, India. He has been writing articles related to business and scientific topics for more than one and a half years. He has extensive experience in writing about advanced technologies, information technology, machinery, metals and metal products, clean technologies, finance and banking, automotive, household products, and the aerospace industry. He is passionate about the latest developments in advanced technologies, the ways these developments can be implemented in a real-world situation, and how these developments can positively impact common people.

