Scientists have discovered that conversations do more than exchange words: they synchronize brain activity. By combining AI with neuroimaging, this study shows how emotional content and language structure shape our neural responses, unlocking new insights into human interaction.
Figure: Schematic summary of the experimental procedure. (A) Device setup and seating arrangement of dyads during the experimental sessions. (B) Image from Azhari et al. (2019): schematic diagram of the positions of 20 functional near-infrared spectroscopy (fNIRS) channels used to measure activity in the superior frontal gyrus (SFG), middle frontal gyrus (MFG), inferior frontal gyrus (IFG), and anterior prefrontal cortex (aPFC). (C) Emotional content of sentences computed with EmoAtlas (Semeraro et al., 2025). (D) Syntactic parsing of sentences. (E) Representation of the syntactic/semantic structure of sentences as textual forma mentis networks (Stella et al., 2019).
Bringing research from the lab to the home, from a controlled environment to real life, is one way to understand human interaction. As technology evolves, its potential grows, driving both scientific exploration and real-world applications. In this spirit, the authors of this study have taken a step forward in understanding what happens in the brain when two people come into contact and interact with each other, for example during a conversation, when exchanging a gift, or in other cooperative situations.
The methodology and results are described in an article titled "Emotional content and semantic structure of dialogues are associated with interpersonal neural synchrony in the prefrontal cortex," which was recently published in the scientific journal NeuroImage.
The paper was authored by Alessandro Carollo and Gianluca Esposito (corresponding authors), Massimo Stella, and Andrea Bizzego of the University of Trento (Department of Psychology and Cognitive Science), in an international collaboration with Mengyu Lim of Nanyang Technological University, Singapore.
Their work has shed new light on the association between how people communicate, in terms of emotions and language, and their brain activity.
"For the first time, we have combined AI techniques to neuroimaging measurements obtained on two people at the same time. We have worked in a laboratory setting, but we tried to create less controlled situations than usual, so that each participating couple was free to invent a dialogue as well as to imagine giving each other a gift and being surprised to receive it," says Alessandro Carollo, first author of the study.
The research, conducted in the Department of Psychology and Cognitive Science laboratories at the University of Trento in Rovereto, involved 42 pairs of participants (84 individuals) aged between 18 and 35.
"We combined artificial intelligence techniques with the most advanced brain imaging technology to study how emotions and the structure of language influence brain activity in interactions. This study reveals that, when two people interact, their brain activity is synchronized, especially in the prefrontal cortex. Emotional content and the structure of language are connected to this neural synchrony," explains Gianluca Esposito.
The dialogues were transcribed by hand, and artificial intelligence techniques were used to encode the transcriptions and derive emotional and syntactic/semantic indices of the conversations.
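The figure caption above names the study's actual tools: EmoAtlas (Semeraro et al., 2025) for emotional content and textual forma mentis networks (Stella et al., 2019) for syntactic/semantic structure. As a rough illustration of this kind of encoding, and not a reproduction of those tools, the sketch below combines spaCy's dependency parser with a hypothetical mini emotion lexicon; `TOY_EMOTION_LEXICON` and `encode_utterance` are invented for this example.

```python
import spacy
from collections import Counter

# Hypothetical mini-lexicon standing in for a full emotion resource;
# the study itself used EmoAtlas (Semeraro et al., 2025).
TOY_EMOTION_LEXICON = {
    "gift": "joy", "surprise": "surprise", "happy": "joy",
    "thank": "trust", "worried": "fear",
}

# Requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

def encode_utterance(text):
    """Return a toy (emotion_counts, dependency_edges) pair for one
    transcribed utterance."""
    doc = nlp(text)
    emotions = Counter(
        TOY_EMOTION_LEXICON[tok.lemma_.lower()]
        for tok in doc if tok.lemma_.lower() in TOY_EMOTION_LEXICON
    )
    # Dependency edges approximate the syntactic links that textual
    # forma mentis networks build on (Stella et al., 2019).
    edges = [(tok.head.lemma_, tok.dep_, tok.lemma_) for tok in doc
             if tok.dep_ != "ROOT"]
    return emotions, edges

emo, deps = encode_utterance("I am so happy to receive this gift, thank you!")
print(emo)   # with this toy lexicon: joy x2, trust x1
```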
Functional near-infrared spectroscopy (fNIRS) was used for neuroimaging measurements.
This technique is worn on the head like an electroencephalogram but is far less constraining than magnetic resonance imaging and other methods. It records the dynamics of hemoglobin, the molecule that carries oxygen in the blood, in different brain areas: light sources mounted on a cap emit near-infrared light into the scalp, and photodetectors measure how much of it is absorbed by hemoglobin, providing an index of brain activity.
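The step from measured light to hemoglobin dynamics described here is commonly formalized with the modified Beer-Lambert law: for each wavelength, the change in optical density is a weighted sum of the changes in oxygenated and deoxygenated hemoglobin, so measuring at two wavelengths yields a solvable 2x2 system. The sketch below shows that computation; the extinction coefficients, pathlength values, and input optical densities are approximate placeholders, not values from the study.

```python
import numpy as np

# Modified Beer-Lambert law: for each wavelength w,
#   dOD(w) = (eps_HbO(w) * d[HbO] + eps_HbR(w) * d[HbR]) * d * DPF(w)
# where dOD is the change in optical density, eps are extinction
# coefficients, d the source-detector distance, and DPF the
# differential pathlength factor. Values below are approximate
# placeholders for illustration, not calibrated constants.

eps = np.array([[1.49, 3.84],    # eps at ~760 nm: [HbO, HbR] (assumed)
                [2.53, 1.80]])   # eps at ~850 nm: [HbO, HbR] (assumed)
d = 3.0                          # source-detector separation, cm (assumed)
dpf = np.array([6.0, 6.0])       # assumed DPF per wavelength

def hb_changes(delta_od):
    """Solve the 2x2 system for d[HbO] and d[HbR] given the measured
    changes in optical density at the two wavelengths."""
    A = eps * (d * dpf)[:, None]   # effective pathlength per wavelength
    return np.linalg.solve(A, delta_od)

d_hbo, d_hbr = hb_changes(np.array([0.012, 0.018]))
print(f"d[HbO] = {d_hbo:.5f}, d[HbR] = {d_hbr:.5f}")
```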
Alessandro Carollo explains, "It is an easy-to-carry and lightweight technique. It only requires a small box containing a pair of caps and their cables. Then, you plug it into a laptop computer, and that is all you need to study human interactions."
He continues: "The goal is to bring research from the lab to the home, from the controlled environment to real life, where people are free to talk to each other and interact."
The research team's approach also points to promising developments.
Gianluca Esposito states: "The best approach seems to be the transdisciplinary one, which integrates emotional content and semantic/syntactic information. The results obtained on neuronal synchronization have a number of interesting implications.
"The study shows that emotions and language structure influence our conversations and the neural processes that guide how we interact with each other. This opens up new avenues for research into human interactions: between parent and child, between partners, between friends, or simply between two strangers who happen to interact by chance."
"Emotional Content and Semantic Structure of Dialogues are associated with Interpersonal Neural Synchrony in the Prefrontal Cortex" has been published by the open-access scientific journal NeuroImage.
The corresponding authors are Alessandro Carollo (who is also the first author) and Gianluca Esposito. The other authors are Massimo Stella, Andrea Bizzego of UniTrento, and Mengyu Lim of the Nanyang Technological University of Singapore.
Source:
Journal reference:
- Carollo, A., Stella, M., Lim, M., Bizzego, A., & Esposito, G. (2025). Emotional content and semantic structure of dialogues are associated with interpersonal neural synchrony in the prefrontal cortex. NeuroImage, 309, 121087. https://doi.org/10.1016/j.neuroimage.2025.121087