Emotion Recognition: Insights from Natural Body Motion Using Machine Learning

Emotion recognition plays a pivotal role in human-machine interaction and interpersonal communication. However, prior studies have often focused on a limited set of body expressions and artificially posed emotions. In a recent paper published in the journal PLOS ONE, researchers investigated genuine emotions, natural movements, and a comprehensive range of motion parameters. They conducted a lab experiment with 24 participants, inducing happiness, relaxation, fear, sadness, and an emotionally neutral state through pretested films, and then employed machine learning models to classify the emotions.

Study: Emotion Recognition: Insights from Natural Body Motion Using Machine Learning. Image credit: Ole.CNX/Shutterstock

Background

Recognizing emotions is crucial for effective communication and human-machine interactions. Emotions encompass various modalities, including verbalization, voice modulation, facial expressions, and body language. This study investigates the relationship between emotions and body motion, emphasizing the significance of body cues in emotion recognition, particularly when other cues are unavailable or incongruent. The research presents an approach to emotion recognition based on naturalistic body motion in a controlled lab experiment that simulates real-world human-robot interactions.

Previous studies primarily used traditional statistics to explore the connection between emotion and motion, focusing on parameters like postural control during standing and walking, but these studies yielded few significant effects. Therefore, the researchers employed multiple machine learning models to classify emotions.

Key Experimental Details and Data Analysis

Experimental Design and Sample: The study involved 24 participants, with an average age of 25.06 years, selected from a student subject pool. These participants were healthy individuals without injuries or movement impairments. The sample size allowed for detecting medium to large effect sizes. Ethical approval was granted by the Institutional Review Board, and participants provided written informed consent.

Emotion Manipulation: Emotional states, encompassing valence and arousal dimensions, were induced using movies. Participants experienced happiness (positive, high arousal), relaxation (positive, low arousal), fear (negative, high arousal), sadness (negative, low arousal), and emotionally neutral intervals. The selection of these movies was based on pretests involving 80 participants (20 in each condition), ensuring effective emotion elicitation.

Procedure: The study employed a disguised strategy to prevent hypothesis-guessing bias and conscious emotional expression. Participants were informed that the research aimed to explore the relationship between physical and cognitive factors. The experiment occurred in an environment simulating a home setting, emphasizing gait movement between stations A and B, separated by 4.5 meters.

Motion Measures: Body motion data were captured using a motion capture system and a force plate and segmented into standing, gait initiation, and walking phases. A total of 229 motion parameters were analyzed, categorized into balance, standing posture, gait initiation, and walking. Data processing addressed weight shifts during standing and transition phases to avoid biasing the posture parameters.

Statistical Testing: Due to the non-normal distribution of many parameters and the small sample size, nonparametric permutation testing was employed. The analysis included randomized repeated measures analysis of variance (ANOVA) and pairwise comparisons between emotions. Gender differences were explored using a two-way ANOVA, with significance established at a p-value less than 0.05.
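To illustrate the permutation logic described above, the sketch below implements a two-sided paired permutation test on a single motion parameter across two conditions. It is a minimal illustration only: the data are synthetic, and the function name and values are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def paired_permutation_test(x, y, n_perm=10_000, rng=rng):
    """Two-sided paired permutation test on the mean difference.

    Under the null hypothesis, each participant's condition labels are
    exchangeable, so we randomly flip the sign of each paired difference
    and compare the observed mean difference to the permutation null.
    """
    d = np.asarray(x, float) - np.asarray(y, float)
    observed = d.mean()
    signs = rng.choice([-1.0, 1.0], size=(n_perm, d.size))
    null = (signs * d).mean(axis=1)
    # p-value: fraction of permuted statistics at least as extreme
    return np.mean(np.abs(null) >= abs(observed))

# Toy data: one hypothetical motion parameter for 24 participants
# measured in two emotion conditions (values are made up)
fear = rng.normal(1.2, 0.3, 24)
neutral = rng.normal(1.0, 0.3, 24)
p = paired_permutation_test(fear, neutral)
```

Because the null distribution is built from the data themselves, this approach needs no normality assumption, which is why it suits skewed parameters and small samples.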

Machine Learning: Several models were tested, including k-nearest neighbors, decision trees, logistic regression, and support vector machines with various kernel functions. Model evaluation metrics included accuracy, recall, precision, and F1 score, with accuracy reflecting the proportion of correctly classified data points.
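The evaluation metrics mentioned here can be computed directly from true and predicted labels. The following minimal sketch shows per-class precision, recall, and F1 alongside overall accuracy; the labels are made up for illustration and do not come from the study's data.

```python
import numpy as np

def classification_metrics(y_true, y_pred, labels):
    """Overall accuracy plus per-class precision, recall, and F1."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    out = {"accuracy": float(np.mean(y_true == y_pred))}
    for c in labels:
        tp = np.sum((y_pred == c) & (y_true == c))  # true positives
        fp = np.sum((y_pred == c) & (y_true != c))  # false positives
        fn = np.sum((y_pred != c) & (y_true == c))  # false negatives
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        out[c] = {"precision": precision, "recall": recall, "f1": f1}
    return out

# Hypothetical predictions over the five emotion conditions
y_true = ["happy", "fear", "sad", "relaxed", "neutral", "happy"]
y_pred = ["happy", "sad", "sad", "relaxed", "fear", "happy"]
m = classification_metrics(y_true, y_pred,
                           ["happy", "fear", "sad", "relaxed", "neutral"])
```

Reporting per-class metrics alongside accuracy matters in multi-class problems like this one, since a model can score reasonable accuracy while missing some emotion classes entirely.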

Results and Analysis

Manipulation Check: The manipulation check involved comparing scores for the intended emotion to scores for other emotions across movie segments. The results revealed significant differences in all cases. This additional manipulation check, consistent with the pretest, confirmed the successful elicitation of intended emotions by the movies.

Effect of Emotion on Motion Parameters: Out of the 229 motion parameters examined, only 7 exhibited significant effects in separate statistical analyses. Six parameters displayed significant effects of emotion on motion, while one parameter revealed a significant moderating role related to gender.

These parameters were the standard deviation (SD) of center of pressure (COP) movement along the mediolateral axis during standing, the SD of the shoulder angle in the sagittal plane during standing, the SD of the shoulder angle in the frontal plane during standing, the SD of the back angle during standing, and the SDs of the left and right wrist-to-hip distances during standing.

These parameters showed significant differences, primarily between one or more emotional conditions and the neutral condition. The seventh parameter, the SD of right-leg swing duration during walking, exhibited a gender-moderation effect: female participants showed an increased SD in the happy condition.

Machine Learning Models: Machine learning models were applied to evaluate emotion recognition based on motion parameters. Support vector machines with radial basis function kernels (RBF-SVM) and decision tree models achieved the highest accuracy, at 45.83%. Due to its interpretability, the decision tree model was explored further: of the 229 available features, the tree used 25 for emotion classification, with leaf nodes at depths ranging from 10 to 27. Feature importance was assessed using Gini index values, which indicate how well splitting along a particular measure separates the emotion classes.
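To make the Gini criterion concrete, the sketch below computes Gini impurity and searches for the best split threshold on a single feature, the core operation a decision tree repeats at every node. This is a toy illustration with invented data, not the authors' model.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum(p_c^2)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Threshold on one feature that maximizes the Gini decrease."""
    x, y = np.asarray(x), np.asarray(y)
    order = np.argsort(x)
    x, y = x[order], y[order]
    parent = gini(y)
    best = (0.0, None)  # (impurity decrease, threshold)
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue  # no threshold separates identical values
        left, right = y[:i], y[i:]
        # Weighted average impurity of the two children
        child = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        decrease = parent - child
        if decrease > best[0]:
            best = (decrease, (x[i] + x[i - 1]) / 2)
    return best

# Toy feature that separates two classes cleanly
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0, 0, 1, 1])
decrease, threshold = best_split(x, y)
```

Summing these impurity decreases over all nodes that split on a given feature, normalized across features, is how Gini-based feature importances are typically derived.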

Conclusion

In summary, researchers emphasized the limitations of relying on individual motion parameters for emotion recognition and highlighted the potential of machine learning models that consider multiple parameters. Future research should focus on refining these models for greater accuracy. Moreover, emerging technologies such as Microsoft Kinect and pose estimation tools may have practical applications for emotion recognition in home settings. Additionally, exploring these effects among the elderly population could provide insights into the development of elder-assisting devices.


Written by

Dr. Sampath Lonka

Dr. Sampath Lonka is a scientific writer based in Bangalore, India, with a strong academic background in Mathematics and extensive experience in content writing. He has a Ph.D. in Mathematics from the University of Hyderabad and is deeply passionate about teaching, writing, and research. Sampath enjoys teaching Mathematics, Statistics, and AI to both undergraduate and postgraduate students. What sets him apart is his unique approach to teaching Mathematics through programming, making the subject more engaging and practical for students.

