Show simple item record

dc.contributor.author      Kask, Annika
dc.contributor.author      Põldver, Nele
dc.contributor.author      Ausmees, Liisi
dc.contributor.author      Kreegipuu, Kairi
dc.date.accessioned        2021-05-10T19:25:09Z
dc.date.available          2021-05-10T19:25:09Z
dc.date.issued             2021
dc.identifier.uri          https://datadoi.ee/handle/33/340
dc.identifier.uri          https://doi.org/10.23673/re-282
dc.description.abstract    The present study investigates how the brain automatically discriminates emotional schematic faces, as indicated by the mismatch responses, and how reliable these brain responses are. Thirty-three healthy volunteers participated in the vMMN EEG experiment with four experimental sets differing from each other by the type of standard (object with scrambled face features) and the type of deviants (Angry, Happy and Neutral schematic faces) presented. Conscious subjective evaluations of valence, arousal and attention catching of the same stimuli showed clear differentiation of emotional expressions. Deviant faces elicited rather similar vMMN at frontal and occipital sites. Bayesian analyses suggest that vMMN does not differ between angry and happy faces. Neutral faces, however, did not yield statistically significant vMMN at occipital leads. Pearson’s correlation and intra-class correlation analyses showed that the brain’s reactions to the stimuli were highly stable within individuals across the experimental sets, whereas the mismatch responses were much more variable.  [en]
dc.relation                PRG770  [en]
dc.rights                  info:eu-repo/semantics/openAccess  [en]
dc.subject                 visual mismatch response, vMMN, automatic visual processing, affective processing, schematic faces, reliability  [en]
dc.title                   Subjectively different emotional schematic faces not automatically discriminated from the brain’s bioelectrical responses  [en]
dc.type                    info:eu-repo/semantics/dataset
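The abstract above reports within-subject stability assessed with Pearson’s correlation and intra-class correlation across the four experimental sets. The following minimal Python sketch illustrates how such a reliability check could be computed from a subjects-by-sets matrix of response amplitudes; the data layout, the placeholder values, the generic set labels and the choice of the consistency ICC(3,1) variant are assumptions made for illustration and are not taken from the dataset itself.

    # Minimal sketch (not from the record): within-subject stability of a response
    # measure across four experimental sets, using pairwise Pearson correlations and
    # a consistency ICC(3,1). Array layout and placeholder data are assumptions.
    from itertools import combinations

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_subjects, n_sets = 33, 4                      # 33 volunteers, 4 experimental sets
    set_labels = [f"Set {i + 1}" for i in range(n_sets)]

    # Placeholder matrix: one mean response amplitude per subject and set (microvolts).
    amplitudes = rng.normal(loc=-1.0, scale=0.8, size=(n_subjects, n_sets))

    # Pairwise Pearson correlations between experimental sets.
    for (i, a), (j, b) in combinations(enumerate(set_labels), 2):
        r, p = pearsonr(amplitudes[:, i], amplitudes[:, j])
        print(f"{a} vs {b}: r = {r:.2f} (p = {p:.3f})")

    def icc_3_1(x: np.ndarray) -> float:
        """Consistency ICC(3,1) for a subjects-by-sets matrix (two-way mixed model)."""
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between sets
        ss_total = ((x - grand) ** 2).sum()
        ss_error = ss_total - ss_rows - ss_cols                # residual
        ms_rows = ss_rows / (n - 1)
        ms_error = ss_error / ((n - 1) * (k - 1))
        return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

    print(f"ICC(3,1) across sets: {icc_3_1(amplitudes):.2f}")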

