Abstract
Humans and other primate species are experts at recognizing affective information from body movements, but the underlying brain mechanisms are still largely unknown. Previous research focusing on the brain representation of symbolic emotion categories has led to mixed results. This study used representational similarity and multi-voxel pattern analysis techniques to investigate how postural and kinematic features computed from affective whole-body movement videos relate to brain processes. We show that body posture and kinematics differentially activated brain regions, indicating that this information might be selectively encoded there. Specifically, the limb-contraction feature seemed particularly relevant for distinguishing fear, and it was represented in several regions spanning the affective, action observation, and motor preparation networks. Our approach goes beyond traditional methods of mapping symbolic emotion categories to brain activation/deactivation by discovering which specific movement features are encoded in the brain and possibly drive automatic emotion perception.
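The core of representational similarity analysis (RSA) as named above can be sketched as follows. This is an illustrative toy example on random data, not the authors' actual pipeline: the stimulus counts, feature names, and voxel data here are all hypothetical. The idea is to build a representational dissimilarity matrix (RDM) over the stimuli from the movement features, another RDM from the voxel response patterns, and rank-correlate the two.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Toy data: 20 video stimuli described by hypothetical movement features
# (e.g. limb contraction, velocity) and by voxel response patterns.
n_stimuli, n_features, n_voxels = 20, 5, 100
features = rng.normal(size=(n_stimuli, n_features))
voxel_patterns = rng.normal(size=(n_stimuli, n_voxels))

def rdm(data, metric="correlation"):
    """Representational dissimilarity matrix over stimuli, returned as the
    condensed upper-triangle vector that scipy's pdist produces."""
    return pdist(data, metric=metric)

# RSA: rank-correlate the feature-based RDM with the neural RDM.
rho, p = spearmanr(rdm(features), rdm(voxel_patterns))
print(f"feature-to-brain RSA correlation: rho={rho:.3f} (p={p:.3f})")
```

In practice the neural RDM would be computed per region (e.g. via a searchlight), and significance would be assessed with permutation tests rather than the parametric p-value shown here.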