Emotional Facial Expression Processing of Emoticon: an ERP Study

Taejin Park (tpark@jnu.ac.kr)
Department of Psychology, Chonnam National University
Gwangju 500-757, Republic of Korea

Abstract

This study examined the processing of emotional facial expressions of emoticons compared with that of human faces. With pictures of emoticon faces and human faces expressing happy, angry, fearful, and neutral emotions, along with pictures of houses and scrambled faces, 24 participants were required either to judge the location of two gaps on the contour of the facial and control stimuli (non-attention) or to judge the pleasantness of the stimuli (focused attention). The face-specific N170 to ignored facial expressions showed an emotion effect only for the fearful expression of the human face and showed no emotion effect for any emoticon facial expression at either hemisphere. To attended facial expressions, however, the N170 showed an emotion effect for all emoticon expressions as well as all human expressions at the right hemisphere, and a limited emotion effect for emoticon and human expressions at the left hemisphere. These results suggest that attended facial expressions of emoticons are processed in a similar way to those of humans, but that ignored facial expressions of emoticons can hardly be processed, indicating that the processing of emotional facial expressions of emoticons depends on more attentional resources than that of human faces.

Keywords: facial expression; emoticon; attention; N170

Introduction

Human facial stimuli convey important emotional information in social exchange. Recordings at the scalp have demonstrated ERP components reflecting face-specific responses peaking at around 170 ms post-stimulus at lateral occipito-temporal electrodes (Bentin et al., 1996). The N170 has shown substantial specificity for faces, typically demonstrating a smaller or absent response to non-face stimuli (Itier & Taylor, 2004). Because the N170 clearly distinguishes faces from non-face visual stimuli, it is considered an index of the configural processing of the face (Bentin et al., 1996; Rossion et al., 2003; Itier & Taylor, 2004). Source localization studies have localized the generators of the N170 to the fusiform gyrus, which has been termed the "fusiform face area" (Kanwisher, McDermott, & Chun, 1997; Itier & Taylor, 2004).

There is conflicting evidence regarding whether the N170 is responsive to emotional expression. Some researchers have found that the N170 does not discriminate emotional expressions (Eimer, Holmes, & McGlone, 2003), while others have found that expression modulates N170 amplitude (Batty & Taylor, 2003; Blau et al., 2007; Luo et al., 2010; Williams et al., 2006), showing a larger amplitude for fearful relative to neutral faces (Batty & Taylor, 2003). These discrepancies in experimental findings might be related to differences in experimental design and stimuli.

Previous studies examining the effect of attention on ERPs during emotional facial perception have shown that correlates of facial expression processing are modulated by spatial attention (Pessoa et al., 2002; Eimer et al., 2003; Holmes et al., 2003). Facial expression effects were eliminated when attention was directed away from the location of peripherally presented emotional faces, indicating that facial expressions are not processed preattentively (Eimer et al., 2003). But when faces presented within foveal vision were unattended, early emotional expression effects in the 160-220 ms post-stimulus interval were still preserved and were eliminated only beyond 220 ms post-stimulus (Holmes, Kiss, & Eimer, 2006). These results suggest that when faces are presented foveally, the initial rapid stage of emotional expression processing is unaffected by attention. However, using an attentional blink procedure, a recent study demonstrated that the amplitude of the N170 depends on attentional resources even when faces are presented within foveal vision (Luo et al., 2010). The controversy about the attentional dependency of facial emotional processing thus remains unresolved.

Facial expressions of emoticons have been widely used as substitutes for those of humans. But whether the facial expression processing of emoticons is similar to that of humans remains unknown.

The aim of this study was to examine electrophysiological correlates of emotional expressions of emoticon faces and compare them to those of human faces. Another aim was to examine the attentional dependency of facial expression processing of emoticons and compare it to that of humans. Specifically, I was interested in whether the emotion effect observable in the N170 elicited by emoticons would be distinguishable from that elicited by human faces, and whether the emotion effect for emoticon faces would be modulated by selective attention to the same extent as that for human faces.

Method

Participants. Twenty-four paid undergraduates from Chonnam National University participated in the experiment. All participants were healthy, right-handed individuals with normal or corrected-to-normal vision. They gave written informed consent for participation. The study was approved by the ethics committee at Chonnam National University.

Stimuli and Procedure. Pictures of emoticon faces and human faces expressing happy, angry, fearful, and neutral emotions, along with pictures of houses and scrambled faces, were used (Fig. 1). Each stimulus had a contour with two gaps, one on the left side and one on the right, located at different heights. Participants were required either to judge which gap was located higher (non-attention) by pressing one of two keys, or to judge the pleasantness of the stimuli (focused attention) by pressing one of three keys (pleasant, neutral, unpleasant).

ERP recording and analysis. ERPs were recorded from 40 scalp electrodes placed according to the international 10-20 system. Horizontal and vertical EOGs were recorded for EOG artifact correction. The impedance of all electrodes was kept below 5 kΩ. Stimulus presentation was controlled by a PC running E-Prime software. EEG was sampled at 250 Hz with the vertex electrode as the online reference. Offline, the continuous EEG record was segmented into epochs of 1000 ms starting 200 ms prior to stimulus onset, transformed to the average reference, and corrected against the 200 ms prestimulus baseline. The face-specific N170 component was analyzed at inferior occipito-temporal sites (PO7 over the left hemisphere and PO8 over the right hemisphere).
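For concreteness, the offline segmentation, re-referencing, and baseline-correction steps described above can be sketched in MNE-Python. The paper does not name its analysis software, and the file name and event codes below are illustrative assumptions, not the actual values used:

import mne

# Hypothetical continuous recording; the study sampled EEG at 250 Hz
# with the vertex electrode as the online reference.
raw = mne.io.read_raw_fif("sub01_raw.fif", preload=True)

# Assumed event codes for (a subset of) the stimulus categories.
events = mne.find_events(raw)
event_id = {"human/fear": 1, "human/neutral": 2,
            "emoticon/fear": 3, "emoticon/neutral": 4}

# 1000 ms epochs starting 200 ms before stimulus onset,
# baseline-corrected to the 200 ms prestimulus interval.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(-0.2, 0.0),
                    preload=True)

# Replace the online vertex reference with the average reference.
epochs.set_eeg_reference("average")

# The N170 is then read out at the inferior occipito-temporal sites.
evoked = epochs["human/fear"].average()
evoked.plot(picks=["PO7", "PO8"])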
Fig. 1. Experimental stimuli (except human faces)

Results

When the two control pictures (house and scrambled face) were compared with the neutral faces of the emoticon and the human, the amplitudes of the N170 in response to the emoticon face and the human face were significantly larger than those to the two control stimuli in both the non-attention and the focused attention condition, suggesting that the N170 is sensitive to emoticon faces as well as human faces.

To elucidate the emotion effect on facial expression processing, N170 amplitudes in response to emotional expressions were compared with those to the neutral expression. In response to ignored facial expressions (non-attention condition), the amplitude of the N170 showed a significant emotion effect for the fearful expression of the human face (enhanced amplitude to the fearful relative to the neutral expression), but showed no emotion effect for any facial expression of the emoticon at either hemisphere. These results suggest that the processing of the fearful expression of the human face is unaffected by attention, whereas emotional expressions of the emoticon cannot be processed without sufficient attentional resources.

In response to attended facial expressions (focused attention condition), the amplitude of the N170 showed a significant emotion effect for all emoticon expressions as well as all human expressions at the right hemisphere, and a significant emotion effect only for the angry emoticon face and the fearful human face at the left hemisphere, suggesting right-hemisphere dominance for emotional face processing. ERP waveforms for fearful and neutral expressions of the emoticon and human face are shown in Fig. 2 and Fig. 3. Average amplitudes for all expressions of the emoticon and human face are shown in Fig. 4.

Fig. 2. Grand average ERPs at focused attention condition

Fig. 3. Grand average ERPs at non-attention condition

Fig. 4. Average amplitudes (μV) of N170 (E, Emoticon; R, Real human face; A, Angry; F, Fear; J, Joy; N, Neutral) (Error bars indicate standard error)

These results suggest that facial expressions of emoticons are processed in a similar way to those of humans when they are fully attended. But when emoticons are ignored, their facial expressions can hardly be processed even when they are presented within foveal vision, suggesting that the processing of facial expressions of emoticons demands more attentional resources than that of human faces.
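As an illustration of how such an emotion-effect comparison can be computed, the sketch below measures each participant's mean N170 amplitude at PO8 in a window around 170 ms and tests fearful against neutral human faces with a paired t-test. The paper does not report its measurement window, file naming, or statistical test, so all of these details are assumptions:

import mne
import numpy as np
from scipy.stats import ttest_rel

conditions = ["human/fear", "human/neutral"]
amps = {c: [] for c in conditions}
for subj in range(1, 25):  # the study had 24 participants
    epochs = mne.read_epochs(f"sub{subj:02d}-epo.fif")  # hypothetical files
    for c in conditions:
        # Per-subject ERP, restricted to PO8 and an assumed 140-200 ms window.
        evk = epochs[c].average().pick(["PO8"]).crop(0.14, 0.20)
        amps[c].append(evk.data.mean())  # mean amplitude in the N170 window

# Paired comparison across the 24 participants.
t, p = ttest_rel(np.array(amps["human/fear"]), np.array(amps["human/neutral"]))
print(f"N170 emotion effect at PO8: t(23) = {t:.2f}, p = {p:.3f}")

The same computation at PO7 gives the left-hemisphere comparison; in practice, a repeated-measures ANOVA over emotion, face type, and hemisphere would be the more usual omnibus test.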
Acknowledgments

This work was supported by the National Research Foundation of Korea Grant funded by the Korean Government (NRF-2014S1A5A2A01016925).

References

Batty, M., & Taylor, M. J. (2003). Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17, 613-620.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

Blau, V. C., Maurer, U., Tottenham, N., & McCandliss, B. D. (2007). The face-specific N170 component is modulated by emotional facial expression. Behavioral and Brain Functions, 3:7. doi:10.1186/1744-9081-3-7

Eimer, M., Holmes, A., & McGlone, F. (2003). The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, & Behavioral Neuroscience, 3, 97-110.

Holmes, A., Kiss, M., & Eimer, M. (2006). Attention modulates the processing of emotional expression triggered by foveal faces. Neuroscience Letters, 394, 48-52.

Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.

Itier, R. J., & Taylor, M. J. (2004). N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cerebral Cortex, 14, 132-142.

Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: a module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17, 4302-4311.

Luo, W., Feng, W., He, W., Wang, N., & Luo, Y. (2010). Three stages of facial expression processing: ERP study with rapid serial visual presentation. NeuroImage, 49, 1857-1867.

Pessoa, L., McKenna, M., Gutierrez, E., & Ungerleider, L. G. (2002). Neural processing of emotional faces requires attention. Proceedings of the National Academy of Sciences USA, 99, 11458-11463.

Rossion, B., Joyce, C. A., Cottrell, G. W., & Tarr, M. J. (2003). Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. NeuroImage, 20, 1609-1624.

Williams, L. M., Palmer, D., Liddell, B. J., Song, L., & Gordon, E. (2006). The when and where of perceiving signals of threat versus non-threat. NeuroImage, 31, 458-467.