Scientists explain how people understand each other


During a conversation, the human brain “mirrors” the processes in the interlocutor’s brain that relate to the words being used. Scientists in the USA have now modeled this communication at the level of brain activity.


According to previous studies, the success of social interaction depends to a large extent on behavioral synchrony: for example, when the members of a musical ensemble are “tuned” to the movements of the soloist, the quality of the performance improves. It is also known that during verbal communication a similar “synchronization” occurs at the level of brain activity. Until now, however, it was unclear what this coupling depends on: the words themselves, or other components of communication such as body language or tone of voice.

Researchers from Princeton University and New York University (both in the USA) found that the joint brain activity of two people during a conversation can be modeled by taking into account the words they use and the context in which those words appear. The results of the study were published in the journal Neuron.

Using electrocorticography, the scientists recorded the brain activity of pairs of patients with epilepsy during natural conversations and also analyzed transcripts of those conversations. They found that word-specific activity peaked in the speaker’s brain about 250 milliseconds before each word was spoken and in the listener’s brain about 250 milliseconds after it was heard.
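
For readers who want a concrete picture of this kind of timing analysis, the sketch below shows how a neural recording can be cut into windows around word onsets so that activity shortly before and after a word can be compared. It is only an illustration: the sampling rate, window length, and data are placeholder assumptions, not the study’s actual parameters.

```python
# Minimal sketch: cut a neural signal into windows around each word onset,
# so activity before (speaker side) and after (listener side) a word can be
# compared. All values below are illustrative placeholders.
import numpy as np

fs = 512                                  # assumed sampling rate, Hz
signal = np.random.randn(fs * 60)         # placeholder 60-second recording
word_onsets_s = [2.4, 5.1, 9.8, 14.3]     # placeholder word-onset times (s)

def epoch(sig, onset_s, pre_s=0.5, post_s=0.5):
    """Return the slice of `sig` from `pre_s` before to `post_s` after onset."""
    start = int((onset_s - pre_s) * fs)
    stop = int((onset_s + post_s) * fs)
    return sig[start:stop]

epochs = np.stack([epoch(signal, t) for t in word_onsets_s])
print(epochs.shape)   # (number of words, samples per one-second window)
```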

“We can see linguistic content emerge word by word before the speaker actually expresses what he or she is trying to say, and we see that the same linguistic content also emerges rapidly in the listener’s brain after he or she hears it,” the authors of the paper write.

The scientists noted that the word-specific brain activity reflected both the words themselves and their meanings. For example, the English word cold can refer to a low temperature, an unemotional personality trait, or an illness (a cold), depending on the surrounding words.
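
The sketch below illustrates this idea of context-dependent word meaning using the publicly available GPT-2 model from the Hugging Face transformers library; the example sentences and the similarity comparison are illustrative assumptions, not the authors’ analysis.

```python
# Minimal sketch: the same word ("cold") gets different contextual
# embeddings in different sentences. Uses the public GPT-2 model as a
# stand-in; this is not the paper's pipeline.
import torch
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

def embedding_of(sentence: str, word: str = " cold") -> torch.Tensor:
    """Return GPT-2's hidden state for `word` as it appears in `sentence`."""
    target_id = tokenizer.encode(word)[0]             # BPE id of " cold"
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, 768)
    position = (inputs["input_ids"][0] == target_id).nonzero()[0].item()
    return hidden[position]

a = embedding_of("The water in the lake was freezing cold.")
b = embedding_of("After the argument she was distant and cold.")
print(torch.nn.functional.cosine_similarity(a, b, dim=0).item())
# Well below 1.0: the same word is represented differently in each context.
```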

The authors of the study used the GPT-2 neural network to extract the context of each word spoken in the conversation, and then trained a model on these data to predict changes in brain activity as information passed from the speaker to the listener. The model’s predictions turned out to be more accurate than those of previous approaches, which underscores the important role of context.
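
As a rough illustration of such an encoding model, the following sketch maps per-word embeddings to simulated electrode responses with ridge regression. The synthetic data, the choice of ridge regression, and all parameters are assumptions; it does not reproduce the paper’s actual pipeline.

```python
# Minimal sketch of a word-level encoding model: predict neural activity
# around each word from that word's contextual embedding. The data are
# synthetic placeholders; the study's lags, electrodes, and validation
# scheme are not reproduced here.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_words, emb_dim, n_electrodes = 2000, 768, 64

# X: one GPT-2-style contextual embedding per spoken word (placeholder).
X = rng.standard_normal((n_words, emb_dim))
# Y: activity per electrode in a window around each word onset
# (placeholder with a weak linear dependence on X, plus noise).
true_weights = rng.standard_normal((emb_dim, n_electrodes)) * 0.05
Y = X @ true_weights + rng.standard_normal((n_words, n_electrodes))

X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.2, random_state=0
)

# Regularized linear map from word embeddings to electrode responses.
encoder = Ridge(alpha=100.0)
encoder.fit(X_train, Y_train)

# Per-electrode accuracy: correlation between predicted and observed
# activity on held-out words.
pred = encoder.predict(X_test)
corr = [np.corrcoef(pred[:, e], Y_test[:, e])[0, 1] for e in range(n_electrodes)]
print(f"mean held-out correlation: {np.mean(corr):.3f}")
```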

In the future, the scientists plan to expand the research and apply the context-based model to other types of brain activity data, such as fMRI, which would make it possible to understand how parts of the brain that electrocorticography cannot reach work during speech.

Source: Port Altele
