AI's Deep Impact: Exploring Facial Expressions in Job Interviews

In an article published in the journal Scientific Reports, researchers from Switzerland, India, and France used artificial intelligence (AI) to generate videos of people showing different facial expressions and tested how these expressions influenced observers' perceptions in a job interview setting. They also demonstrated the benefits of using AI-created videos for studying nonverbal behavior, as the approach allows a high level of standardization and control over the experimental material.

Background

AI-generated media has potential across different domains, including entertainment, art, education, and marketing. Deepfake is an example of an AI-based technology that leverages machine learning algorithms to generate synthetic media, such as images or videos, of people or events that never occurred. It can be used to manipulate a person's facial appearance or behavior in a video, for example by swapping faces, changing expressions, or adjusting lip movements.

While deepfakes can be used for malicious purposes, such as spreading fake news, creating controversial or misleading content, and impersonating celebrities, the technology also presents opportunities for beneficial applications, particularly in experimental and psychological research. Researchers can leverage deepfakes to create standardized and realistic video stimuli for studying how facial expressions influence observer perceptions. This methodology makes it possible to manipulate and examine human behaviors in controlled settings, offering insights into social dynamics and cognitive processes.

About the Research

In the present paper, the authors explored how deepfake-generated videos can overcome some of the methodological challenges in studying the effect of facial expressiveness on first-impression judgments in a job interview context. Facial expressiveness refers to nonverbal behaviors such as gazing, nodding, and smiling, which signal engagement and closeness in interactions. Previous research has shown that facial expressiveness can lead to positive outcomes, such as being liked, trusted, and seen as hireable, affecting how people are judged in areas like education, health, and hiring.

However, traditional methods of studying facial expressiveness, such as recording actors or participants, have limitations: expressive behaviors vary naturally between individuals, their frequency is difficult to control, and factors like gender or age can influence perceptions. To overcome these problems, the authors employed deepfakes to create videos of the same individuals, called "targets," with either expressive or non-expressive faces while they answered interview questions.

The study used a one-shot deepfake technique, in which a single still image is animated with facial movements taken from another source. The researchers collected selfies from 159 students in Switzerland and India as input for the deepfake algorithm and used a video of an actor displaying the desired facial expressions as the reference input. The result was a set of videos showing each target with either an expressive or a non-expressive face.
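To make this pipeline more concrete, the sketch below outlines how a one-shot reenactment step might be wired together. It is an illustrative Python sketch only, not the authors' code: the file names are placeholders, and extract_motion and warp_source are hypothetical stand-ins for the keypoint-detection and image-warping stages of a typical reenactment model.

```python
# Illustrative sketch of a one-shot face-reenactment pipeline (not the authors' code).
import cv2


def extract_motion(frame):
    """Hypothetical stand-in: a real model would estimate facial keypoints/motion here."""
    return frame  # placeholder


def warp_source(source, motion):
    """Hypothetical stand-in: a real model would warp the selfie to match the motion."""
    return source  # placeholder


source_image = cv2.imread("target_selfie.jpg")       # single still photo of the target
cap = cv2.VideoCapture("actor_expressive.mp4")       # actor video supplying the expressions

height, width = source_image.shape[:2]
writer = cv2.VideoWriter("target_expressive.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"), 25.0, (width, height))

while True:
    ok, driving_frame = cap.read()
    if not ok:
        break
    motion = extract_motion(driving_frame)            # expressions come from the actor
    writer.write(warp_source(source_image, motion))   # identity comes from the selfie

cap.release()
writer.release()
```

The property the study relies on is visible in the loop: the target's identity is fixed by a single selfie, while every expressive movement is dictated by the same actor video, so expressiveness can be held constant across all targets.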

The authors then asked 823 observers, recruited online, to watch the target videos and rate the targets on warmth, competence, and overall favorable impression. The observers also rated the perceived realism of the videos as a check on the quality of the deepfake-generated material.
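The paper's exact statistical model is not detailed here, but a comparison of ratings across the two expressiveness conditions could look like the following minimal sketch. The input file and column names (observer_id, condition, warmth) are assumptions for illustration, and a random intercept per observer accounts for each observer rating several targets.

```python
# Illustrative analysis sketch, not the authors' exact model.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ratings table: one row per observer-target rating, with columns
# observer_id, target_id, condition ("expressive"/"non-expressive"), warmth, competence.
ratings = pd.read_csv("observer_ratings.csv")

# Linear mixed-effects model: warmth as a function of the expressiveness condition,
# with a random intercept per observer to handle repeated ratings by the same person.
model = smf.mixedlm("warmth ~ C(condition)", data=ratings, groups=ratings["observer_id"])
result = model.fit()
print(result.summary())
```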

Research Findings

The results showed that facial expressiveness positively influenced impressions in the interview setting. Targets who displayed more facial expressiveness through actions like nodding, gazing, and smiling were viewed as more competent, warmer, and more favorable than those who exhibited less expressiveness, such as looking away, not nodding, or not smiling. This effect remained consistent across target genders and cultures, indicating its robustness. Moreover, the videos created through deepfake technology were perceived as realistic, indicating their effectiveness in generating naturalistic experimental material.

Additionally, the authors investigated how factors like facial expressiveness, target gender, and target culture interacted to influence outcomes. Among targets who showed less expressiveness, Indian female targets were perceived as more competent than both Indian male targets and Swiss female targets. Furthermore, non-expressive Indian targets were seen as warmer and left a more favorable impression compared to non-expressive Swiss targets.
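A natural way to probe such interactions, again as an illustrative sketch rather than the authors' reported analysis, is to extend the mixed model above with crossed fixed effects for condition, target gender, and target culture; the column names remain hypothetical.

```python
# Illustrative interaction sketch, reusing the hypothetical ratings table from above.
import pandas as pd
import statsmodels.formula.api as smf

ratings = pd.read_csv("observer_ratings.csv")

# Full-factorial fixed effects: condition x target gender x target culture,
# with a random intercept per observer.
model = smf.mixedlm("competence ~ C(condition) * C(gender) * C(culture)",
                    data=ratings, groups=ratings["observer_id"])
print(model.fit().summary())
```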

Applications

The paper effectively illustrates the potential of deepfake technology for investigating nonverbal behavior and its impact on social outcomes. Through the use of AI-generated videos, the authors were able to manipulate the characteristics of the individuals displaying the behaviors, as well as standardize factors such as timing, intensity, frequency, and co-occurrence of these behaviors in a controlled manner. This approach enables a more rigorous and precise evaluation of the causal relationship between nonverbal behavior and social outcomes, enhancing understanding of the underlying mechanisms and influencing factors.

The findings suggest that job interview training should not only focus on verbal communication but also emphasize the importance of nonverbal cues, such as listening attentively and reacting appropriately. These nonverbal behaviors signify engagement and closeness, contributing to the formation of favorable impressions.

Conclusion

In summary, the research demonstrates the feasibility of using deepfake technology to produce videos depicting various facial expressions and to assess their effects on observers' perceptions in job interview scenarios. The study replicates and supports previous findings showing that facial expressiveness is related to more favorable first impressions. It also shows the benefits of using AI-generated videos for studying nonverbal behavior, as they allow a high level of standardization and control over the experimental material.

The researchers acknowledged limitations and challenges and suggested directions for future research. They proposed using deepfakes to manipulate other nonverbal cues, such as voice, posture, or gestures, and analyzing their impact on different social outcomes, such as persuasion, trust, or cooperation. Future studies could also use deepfakes to create more dynamic and authentic scenarios, such as group interactions or cross-cultural exchanges, to delve deeper into the impact of nonverbal behavior in various contexts.

Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.
