31st Colloquium (February 28, 2008 / 8th Young Researchers' Colloquium)
The N170 controversy:
Is there an ERP wave responding specifically to the human face?
Guillaume Thierry (University of Wales, School of Psychology)
Establishing when and how the human brain differentiates between object categories is key to understanding visual cognition. In particular, it has been shown that the human face triggers specific behavioural and neural response patterns compared with other objects. One widely acknowledged index of face processing is the N170, a negative event-related potential (ERP) peaking around 170 ms after stimulus presentation. However, a potential methodological confound may be present in a number of studies reporting the N170 as face-selective: faces are often presented in a full front view, whereas other stimuli are generally more perceptually variable across experimental trials in terms of size, orientation, eccentricity, etc. In a recent study (Thierry et al., 2007), we manipulated inter-stimulus perceptual variance (ISPV; low / high) and object category (faces / cars) in a fully randomized two-by-two design. Unexpectedly, the N170 was sensitive not to object category but to ISPV. In addition, the first positive peak after stimulus onset, the P1, was modulated by object category but not by ISPV, with no interaction. We then replicated these results using side views of faces and butterflies: the N170 was modulated by ISPV but not by category, and the P1 displayed category-selective behaviour but no sensitivity to ISPV. These results call into question the face-selectivity of the N170 component. Moreover, they show early ERP category effects some 70 ms before the traditionally reported time range. These findings have triggered a virulent reaction among ERP face experts (Bentin et al., 2007; Rossion and Jacques, in press). I will review some of the key arguments around which the dispute is ongoing and discuss the wider methodological implications of this "dispute".
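To make the design concrete, the sketch below is a purely hypothetical illustration, not the authors' analysis pipeline: it simulates single-channel averaged ERPs for the four cells of a faces/cars × low/high-ISPV design and extracts P1 and N170 peak amplitudes from conventional time windows. The sampling rate, window boundaries and simulated waveforms are all assumptions made for the sake of the example.

```python
# Hypothetical sketch: P1 / N170 peak extraction in a 2x2 design
# (category: faces/cars  x  inter-stimulus perceptual variance: low/high).
# The data are simulated; nothing here reproduces the actual study.
import numpy as np

SFREQ = 500                                   # sampling rate in Hz (assumed)
TIMES = np.arange(-0.1, 0.5, 1 / SFREQ)       # epoch from -100 ms to +500 ms

def peak_amplitude(erp, times, tmin, tmax, polarity):
    """Return peak amplitude and latency inside a time window.

    polarity = +1 finds the most positive peak (P1),
    polarity = -1 finds the most negative peak (N170).
    """
    mask = (times >= tmin) & (times <= tmax)
    window = erp[mask]
    idx = np.argmax(polarity * window)
    return window[idx], times[mask][idx]

# Fake averaged ERPs (one occipito-temporal channel) for each design cell.
rng = np.random.default_rng(0)
conditions = ["face_lowISPV", "face_highISPV", "car_lowISPV", "car_highISPV"]
erps = {c: rng.normal(0, 0.5, TIMES.size)                  # background noise
           + 3 * np.exp(-((TIMES - 0.10) / 0.02) ** 2)     # P1-like bump ~100 ms
           - 4 * np.exp(-((TIMES - 0.17) / 0.02) ** 2)     # N170-like dip ~170 ms
        for c in conditions}

for cond, erp in erps.items():
    p1, p1_lat = peak_amplitude(erp, TIMES, 0.08, 0.13, polarity=+1)
    n170, n170_lat = peak_amplitude(erp, TIMES, 0.14, 0.20, polarity=-1)
    print(f"{cond:15s}  P1 = {p1:+.2f} uV at {p1_lat*1000:.0f} ms   "
          f"N170 = {n170:+.2f} uV at {n170_lat*1000:.0f} ms")
```

In a real analysis the per-condition peaks extracted this way would feed a 2 × 2 ANOVA, which is how a main effect of ISPV (but not category) on the N170 would be assessed.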
Integrating speech and spontaneous hand gesture in comprehension
Sotaro Kita (University of Birmingham, School of Psychology)
In everyday conversation, speech is often accompanied by spontaneous hand gestures, and we effortlessly integrate information from the two sources to form a unified interpretation. My presentation summarizes two studies that investigated the time course of this integration process: one used event-related potentials (ERPs) generated by the brain, and the other used behavioural experiments with the "gating" paradigm. The results indicated that the comprehension processes for speech and hand gesture proceeded in parallel, without one waiting for the other, and that the two processes integrated their outputs incrementally, without waiting for the complete final outputs.
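As a loose schematic of what "parallel and incremental" integration means (purely illustrative, not a model from the studies above), the sketch below lets two streams of hypothetical partial interpretations, one for speech and one for gesture, advance independently while a combiner merges whatever is available at each time step instead of waiting for either stream to finish.

```python
# Loose schematic: two parallel comprehension streams whose partial outputs
# are combined incrementally. All content strings are invented examples.
from itertools import zip_longest

# Hypothetical partial outputs, in temporal order.
speech_increments  = ["the", "the ball", "the ball rolls", "the ball rolls down"]
gesture_increments = ["hand raised", "downward arc", "downward arc, rightward"]

def integrate(speech_so_far, gesture_so_far):
    """Combine whatever partial information is currently available."""
    return f"speech <{speech_so_far or '...'}> + gesture <{gesture_so_far or '...'}>"

# Incremental integration: at every time step, merge the latest partial outputs
# rather than waiting for either stream's complete final output.
speech_state, gesture_state = None, None
for t, (s, g) in enumerate(zip_longest(speech_increments, gesture_increments)):
    speech_state = s if s is not None else speech_state    # streams advance independently
    gesture_state = g if g is not None else gesture_state
    print(f"t={t}: {integrate(speech_state, gesture_state)}")
```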
| Date & Time | Thursday, February 28, 2008, from 17:00 |
|---|---|
| Venue | University Building No. 8 (Faculty of Engineering), Meeting Room 2 |