Social interaction

  • Live vs video interaction: sensorimotor and visual cortical oscillations during action observation

    The online communication formats that emerged during the pandemic have firmly established themselves in our lives, particularly in education and business. While this new medium has given us more freedom and opportunity, many people remain dissatisfied with remote interaction. Most believe that face-to-face communication lets us better understand and connect with the other person: the speaker can sense the energy and emotions of the audience, and vice versa. This is because, in addition to verbal information, we receive a stream of non-verbal cues, such as body language and tone of voice, which help us form a more complete picture of the other person's thoughts and feelings.

    It has been shown that the mirror neuron system and the motor and sensorimotor areas of the cortex play an important role in the perception of non-verbal visual information, including information about the movements, gestures, postures, and facial expressions of others. To investigate how activity in these visual and sensorimotor cortical regions changes during video-mediated interaction, the authors of the article, researchers from IHNA & NPh RAS, analyzed the electroencephalogram (EEG) responses of 83 participants who watched identical actions performed live and shown as video recordings. The experiment was carefully designed so that the images observed by the participants were as similar as possible in the live and video conditions.

    To keep the occipital (visual) alpha rhythm from contaminating the signal attributed to the sensorimotor areas, the authors used independent component analysis (ICA), which separates the mu and alpha components of the EEG. As a control task, participants watched a video of a non-biological object, a ball moving through a maze; a static demonstrator served as the baseline. The mu-rhythm response was analyzed in two frequency ranges: 8-13 Hz and 13-24 Hz. The authors found that the main mu rhythm (8-13 Hz) is highly sensitive to biological and social movement and depends strongly on the format of interaction: live demonstrations evoked a significantly stronger mu-rhythm response in the sensorimotor areas. The alpha rhythm, by contrast, did not distinguish between the types of biological movement observed; however, live demonstrations initially attracted greater visual attention, which then declined to the level seen for the video format. At the same time, the upper mu-rhythm range (13-24 Hz) was more sensitive to the particular gestures observed, which should be taken into account when developing neurointerfaces.
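    The analysis pipeline described above — unmixing overlapping rhythms with ICA and then comparing band power in the 8-13 Hz and 13-24 Hz ranges — can be illustrated with a minimal sketch. This is not the authors' code: the synthetic signals, mixing matrix, and all parameters below are invented for demonstration, using standard scipy/scikit-learn calls.

    ```python
    import numpy as np
    from scipy.signal import welch
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    fs = 250                       # sampling rate, Hz (assumed)
    t = np.arange(0, 20, 1 / fs)   # 20 s of synthetic data

    # Two hypothetical sources: a 10 Hz rhythm (alpha/mu band)
    # and a 20 Hz rhythm (upper mu band), plus noise.
    src = np.c_[np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 20 * t)]
    src += 0.2 * rng.standard_normal(src.shape)

    # Mix the sources into two "channels", as volume conduction
    # mixes cortical sources at the scalp.
    mixing = np.array([[1.0, 0.5],
                       [0.4, 1.0]])
    eeg = src @ mixing.T

    # Unmix with ICA, so occipital alpha does not leak into
    # power estimates attributed to sensorimotor sites.
    ica = FastICA(n_components=2, random_state=0)
    components = ica.fit_transform(eeg)

    def band_power(x, lo, hi):
        """Mean Welch PSD of x within the [lo, hi] Hz band."""
        f, pxx = welch(x, fs=fs, nperseg=fs * 2)
        return pxx[(f >= lo) & (f <= hi)].mean()

    for i in range(components.shape[1]):
        p_low = band_power(components[:, i], 8, 13)
        p_high = band_power(components[:, i], 13, 24)
        print(f"component {i}: 8-13 Hz {p_low:.3g}, 13-24 Hz {p_high:.3g}")
    ```

    In this toy setup, one recovered component carries most of its power in the 8-13 Hz band and the other in the 13-24 Hz band, mirroring how the study could compare the two mu-rhythm ranges per component rather than per raw channel.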

    Therefore, it is important to understand that remote communication may affect both the concentration of visual attention and the subtler forms of social non-verbal communication supported by the motor and sensorimotor areas of the brain.
