IoT-Based System for Recognizing Negative Emotions Using Multimodal Biosignal Data

In a paper published in the journal Electronics, researchers introduced an approach to recognizing negative emotions in mental healthcare that relies on a multimodal biosignal data collection system.

The system combines electroencephalogram (EEG) signals from a headset with physiological data from a smart band, processed through an Internet of Things (IoT) application and a machine learning server. The results indicate high accuracy in recognizing emotions such as disgust, fear, and sadness even when only smart band data are used. This work sets the stage for developing a real-time metaverse system for expressing and detecting negative emotions.

Study: IoT-Based System for Recognizing Negative Emotions Using Multimodal Biosignal Data. Image credit: Generated using DALL·E 3

Background 

The rise of wearable devices such as smart bands and smartwatches, driven by demand for personal health management and advances in information technology, has fuelled interest in biosignal data for health monitoring and emotion recognition. While prior studies often relied on easily influenced cues such as voice or facial expressions, biosignals provide a more objective and less manipulable means of detecting both surface-level and deeper negative emotions, including disgust, fear, and sadness.

Emotion Recognition and Management System

This IoT-based system empowers individuals to recognize and manage their negative emotions in daily life. It consists of three key components: an Android IoT application (APP), a oneM2M-compliant IoT server (IoT server MW), and a machine learning server. The process entails gathering diverse biosignal data from user-friendly wearable IoT devices, namely an EEG headset and a smart band (Microsoft Band 2), and transmitting the data to the Android IoT APP. The IoT client APP communicates with the IoT server MW and creates containers in which the biosignal data are stored; the machine learning server then feeds these data into a multiclass support vector machine (SVM) model with a nonlinear Radial Basis Function (RBF) kernel to classify negative emotions.
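To make the data flow concrete, the sketch below illustrates how a client application might create a container on a oneM2M-compliant server and push combined EEG and smart band readings into it over the standard HTTP binding. The server address, originator ID, resource names, and field names are illustrative assumptions, not details taken from the paper.

```python
import json
import uuid
import requests

# Hypothetical base URL of the oneM2M CSE exposed by the IoT server MW
CSE_BASE = "http://iot-server.example.com:8080/cse-in"
ORIGINATOR = "CEmotionApp"  # assumed application entity identifier

def _headers(resource_type: int | None = None) -> dict:
    """Build oneM2M HTTP-binding headers with a fresh request identifier."""
    content_type = "application/json"
    if resource_type is not None:
        content_type += f";ty={resource_type}"  # ty=3 container, ty=4 contentInstance
    return {"X-M2M-Origin": ORIGINATOR,
            "X-M2M-RI": str(uuid.uuid4()),
            "Content-Type": content_type}

def create_container(parent_path: str, name: str) -> None:
    """Create a container resource to hold biosignal readings."""
    requests.post(f"{CSE_BASE}/{parent_path}",
                  headers=_headers(resource_type=3),
                  data=json.dumps({"m2m:cnt": {"rn": name}}),
                  timeout=5)

def push_sample(container_path: str, sample: dict) -> None:
    """Store one EEG + smart-band reading as a contentInstance."""
    requests.post(f"{CSE_BASE}/{container_path}",
                  headers=_headers(resource_type=4),
                  data=json.dumps({"m2m:cin": {"con": json.dumps(sample)}}),
                  timeout=5)

# Example: one combined reading (field names are assumptions)
create_container("emotion_ae", "biosignals")
push_sample("emotion_ae/biosignals",
            {"beta": 0.31, "gamma": 0.18, "hr": 74, "gsr": 1.9, "skt": 33.2})
```

Storing each reading as a contentInstance under a container mirrors the oneM2M resource model that the IoT server MW is described as following.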

Data Analysis, Correlations, and SVM Classification

Researchers conducted exploratory data analysis to examine the characteristics of the collected multimodal biosignal data for three negative emotions: disgust, fear, and sadness. Histograms were employed to visualize how the data were distributed across specific intervals, revealing key characteristics such as central tendency, asymmetry, and outliers. The analysis showed that the five EEG signals exhibited symmetrical, single-peaked distributions, whereas the heart rate (HR), galvanic skin response (GSR), and skin temperature (SKT) data displayed asymmetric, multi-peaked distributions, with HR and GSR in particular showing right-skewed shapes.
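As a rough illustration of this step, the following sketch plots per-signal histograms from a hypothetical export of the collected dataset; the file name and column names, including the EEG band labels, are assumptions for illustration.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV export: one column per signal plus an emotion label
df = pd.read_csv("biosignals.csv")
signals = ["delta", "theta", "alpha", "beta", "gamma", "hr", "gsr", "skt"]

# One histogram per signal reveals central tendency, skew, and outliers
df[signals].hist(bins=30, figsize=(10, 6), layout=(2, 4))
plt.tight_layout()
plt.show()
```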

Furthermore, researchers conducted correlation analyses using the Pearson correlation coefficient, which expresses the strength of linear relationships independently of measurement units. The results showed the strength and nature of the correlations between variables for each negative emotion. Notably, beta and gamma waves displayed strong positive correlations in the disgust state; in the fear state, beta waves, gamma waves, and HR exhibited significant positive correlations; and in the sadness state, beta waves, gamma waves, and SKT were positively correlated, suggesting a strong connection to this emotional state.
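The sketch below shows how such per-emotion Pearson correlations could be computed with pandas on the same hypothetical dataset; the emotion label column and signal names are again assumptions.

```python
import pandas as pd

df = pd.read_csv("biosignals.csv")  # hypothetical export of the collected data
signals = ["delta", "theta", "alpha", "beta", "gamma", "hr", "gsr", "skt"]

for emotion in ["disgust", "fear", "sadness"]:
    subset = df[df["emotion"] == emotion]
    corr = subset[signals].corr(method="pearson")  # unit-free linear correlation
    print(f"\n{emotion}: signals most correlated with beta waves")
    print(corr["beta"].drop("beta").sort_values(ascending=False).head(3))
```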

The multiclass nonlinear SVM model is a robust data classification tool. Compared with alternatives such as decision trees, random forests, k-nearest neighbors, and artificial neural networks, the SVM handles high-dimensional and large datasets well, making it suitable for this classification task. Because the collected biosignal data follow nonlinear distributions, the researchers selected a nonlinear SVM with a Radial Basis Function (RBF) kernel, which is well suited to such data.

The SVM model is trained on an 80% training split of the multimodal biosignal dataset, with features normalized to the [0, 1] range. It uses a one-versus-rest strategy to handle multiclass classification, decomposing the problem into multiple binary classifiers. The RBF kernel maps the data into a high-dimensional space, enabling the model to separate classes that are not linearly separable. The paper specifies the model's hyperparameters and the RBF kernel formula, which determines how strongly each training vector influences a classification decision.
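A minimal sketch of this training setup, assuming the same hypothetical dataset, is shown below: features are scaled into [0, 1], the data are split 80/20, and a one-versus-rest SVM with an RBF kernel is fitted. The hyperparameter values are placeholders, not those reported in the study.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

df = pd.read_csv("biosignals.csv")  # hypothetical export of the collected data
signals = ["delta", "theta", "alpha", "beta", "gamma", "hr", "gsr", "skt"]
X, y = df[signals].values, df["emotion"].values

# 80% training / 20% test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.8, stratify=y, random_state=42)

# Scale every feature into [0, 1] using statistics from the training set only
scaler = MinMaxScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# One binary RBF-kernel SVM per emotion (one-versus-rest);
# RBF kernel: K(x, x') = exp(-gamma * ||x - x'||^2)
model = OneVsRestClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```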

The Microsoft Band 2 offers a more comfortable alternative to the cumbersome equipment used in previous studies. Unlike traditional studies that classify emotions only into broad categories, this approach achieves a finer-grained classification, effectively distinguishing between the negative emotions of disgust, fear, and sadness and providing a more detailed yet user-friendly picture.

Conclusion

In summary, this paper presents a negative emotion recognition system capable of classifying specific negative emotions. The system relies on an IoT server that continuously gathers multimodal biosignal data from wearable devices, namely an EEG headset and a smart band, and stores the data in real time. The researchers developed a multiclass SVM model with an RBF kernel to enhance emotion classification accuracy.

A comparative analysis of linear and nonlinear SVM models showed that the nonlinear RBF kernel model was superior. Results indicated that using only the band's HR, GSR, and SKT data achieved accuracy comparable to the full multimodal data but with greater efficiency. A grid search identified optimal parameter values, yielding an average accuracy of 98%. Future research will aim to improve real-time emotion detection using smart bands and expand the scope to a broader range of emotions and subtler emotional states.
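The comparison of linear and RBF kernels together with the grid search could look roughly like the sketch below; the parameter grids are illustrative assumptions, and the exact values searched in the study may differ.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Search over a linear kernel and an RBF kernel with varying C and gamma
param_grid = [
    {"kernel": ["linear"], "C": [0.1, 1, 10, 100]},
    {"kernel": ["rbf"], "C": [0.1, 1, 10, 100],
     "gamma": [0.001, 0.01, 0.1, 1]},
]

search = GridSearchCV(SVC(), param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)   # reuses the scaled split from the previous sketch

print("best parameters:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```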


Written by

Silpaja Chandrasekar

Dr. Silpaja Chandrasekar has a Ph.D. in Computer Science from Anna University, Chennai. Her research expertise lies in analyzing traffic parameters under challenging environmental conditions. Additionally, she has gained valuable exposure to diverse research areas, such as detection, tracking, classification, medical image analysis, cancer cell detection, chemistry, and Hamiltonian walks.

