Friday, September 28, 2007
Sex-contingent face aftereffects depend on perceptual category rather than structural encoding
Bestelmeyer PE, Jones BC, Debruine LM, Little AC, Perrett DI, Schneider A, Welling LL, Conway CA.
Many studies have used visual adaptation to investigate how recent experience with faces influences perception. While faces similar to those seen during adaptation phases are typically perceived as more 'normal' after adaptation, it is possible to induce aftereffects in one direction for one category (e.g. female) and simultaneously induce aftereffects in the opposite direction for another category (e.g. male). Such aftereffects could reflect 'category-contingent' adaptation of neurons selective for perceptual category (e.g. male or female) or 'structure-contingent' adaptation of lower-level neurons coding the physical characteristics of different face patterns. We compared these explanations by testing for simultaneous opposite aftereffects following adaptation to (a) two groups of faces from distinct sex categories (male and female) or (b) two groups of faces from the same sex category (female and hyper-female), where the structural differences between the female and hyper-female groups were mathematically identical to those between the male and female groups. We were able to induce opposite aftereffects following adaptation between sex categories but not after adaptation within a sex category. These findings indicate the involvement of neurons coding perceptual category in sex-contingent face aftereffects; such aftereffects cannot be explained by neurons coding only the physical aspects of face patterns.
PMID: 17870064
Posted by Ali at 9:10 AM 0 comments
Tuesday, September 18, 2007
Dynamics of Visual Information Integration in the Brain for Categorizing Facial Expressions
Philippe G. Schyns, Lucy S. Petro, Marie L. Smith
Current Biology, Volume 17, Issue 18, 18 September 2007, Pages 1580-1585
A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for the categorization of visual stimuli have not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event-Related Potential, occurring 170 ms after stimulus onset [1–16]) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with “fear” being faster than “disgust,” itself faster than “happy”). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.
Posted by Ali at 9:51 PM 0 comments