ACTA IMEKO
ISSN: 2221-870X
July 2017, Volume 6, Number 2, 93-98

Decoding of emotional responses to user-unfriendly computer interfaces via electroencephalography signals

Natsue Yoshimura1,2,3, Osamu Koga1, Yu Katsui1, Yousuke Ogata1,2, Hiroyuki Kambara1, Yasuharu Koike1,2

1 Laboratory for Future Interdisciplinary Research of Science and Technology, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Yokohama 226-8503, Japan
2 Integrative Brain Imaging Center, National Center of Neurology and Psychiatry, 4-1-1 Ogawa-Higashi, Kodaira, 187-8551 Tokyo, Japan
3 National Institute of Neuroscience, National Center of Neurology and Psychiatry, 4-1-1 Ogawa-Higashi, Kodaira, 187-8502 Tokyo, Japan

ABSTRACT
When a user interacts with an interface such as a computer, its effects on specific biosignals may reflect emotional responses to the interface, providing a means to evaluate usability. Towards the development of an interface that can adapt its usability based on the user's emotions, here we decoded electroencephalography (EEG) activity occurring during interaction with a user-unfriendly interface. Participants performed target-reaching tasks while irregular transformations were applied to cursor motion to induce frustration. Our results showed that differential signals from the frontal electrodes (AF3 − AF4) were sufficient to classify between brain activities during transformed (frustrated) and normal cursor motion (non-frustrated). Functional magnetic resonance imaging during the same tasks showed significant activations in the middle frontal gyrus, orbitofrontal gyrus, and inferior parietal cortex, areas found to be related to negative emotions. Altogether, these results suggest that the usability of an interface can be measured from EEG signals, which could aid in the development of adaptive interfaces that increase their intuitiveness.

Section: RESEARCH PAPER
Keywords: brain-machine interfaces; human-computer interaction; usability; EEG; emotion
Citation: Natsue Yoshimura, Osamu Koga, Yu Katsui, Yousuke Ogata, Hiroyuki Kambara, Yasuharu Koike, Decoding of emotional responses to user-unfriendly computer interfaces via electroencephalography signals, Acta IMEKO, vol. 6, no. 2, article 17, July 2017, identifier: IMEKO-ACTA-06 (2017)-02-17
Editor: Paolo Carbone, University of Perugia, Italy
Received May 10, 2015; in final form July 4, 2017; published July 2017
Copyright: © 2017 IMEKO. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This work was supported by the Center of Innovation Program from the Japan Science and Technology Agency, and by JSPS KAKENHI grants 26112004, 15H01659, and 15K01849.
Corresponding author: Natsue Yoshimura, e-mail: yoshimura@pi.titech.ac.jp

1. INTRODUCTION

In our daily lives, we are surrounded by computers and are sometimes forced to use complicated, user-unfriendly interfaces. These situations may, in some cases, be due to a lack of methods for measuring emotional response to usability. There is therefore a real need for an objective and quantitative evaluation method for interfaces.

Human activities can be modeled as a decision-making system defined by multiple parameters. A user decides on an action to interact with an interface, such as pressing a button or pulling a lever. This action then results in a change in the interface's state, such as a change on a screen or an action by a connected machine. This change, in turn, affects parameters for the user, who recognizes the interface change and plans the next action in response. Emotion is one such parameter for humans in this model [1]. Humans typically feel comfortable with an easy-to-use interface but frustrated with an ill-behaving one. If we could objectively and quantitatively evaluate these emotional states in response to an interface, we would have an effective measurement of usability, which may in turn yield an adaptive interface that changes its behaviour depending on the user's concurrent feeling.
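The loop just described can be made concrete in a few lines. The toy sketch below treats emotion (valence) as a state variable updated by the mismatch between the user's intended action and the interface's observed response; all names and the update rule are our own illustrative assumptions, not part of the model in [1].

```python
# Conceptual sketch (hypothetical names) of the user-interface loop in which
# emotion is one parameter of the user's decision-making system:
# action -> interface state change -> emotion update -> next action.

def interface_response(action: float, usable: bool) -> float:
    """The interface maps the user's action to a visible state change;
    an unfriendly interface distorts the input."""
    return action if usable else -0.5 * action

def appraise(intended: float, observed: float) -> float:
    """Mismatch between intention and outcome lowers valence (frustration)."""
    return -abs(intended - observed)

emotion = 0.0                  # running valence: negative = frustrated
for step in range(5):
    action = 1.0                                     # user's intended input
    observed = interface_response(action, usable=False)
    emotion += appraise(action, observed)            # emotion updates per outcome
    print(f"step {step}: observed={observed:+.2f}, valence={emotion:+.2f}")
```

In this toy model the emotional state is hidden inside the user; the present study instead aims to read it out directly from brain signals.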
Numerous studies have been performed on the recognition of emotions from biological signals [2]-[8]. Among these, electroencephalography (EEG) has attracted particular attention [9], as it allows for a highly direct measure of emotional state, with the brain controlling the relevant mechanisms for emotion. Li and Lu focused on high gamma band EEG and showed that happiness and sadness can be distinguished with over 90 % accuracy [3]. Asymmetric activity in the frontal region is also known to occur when emotions are evoked [10], and Petrantonakis and Hadjileontiadis utilized this phenomenon to distinguish between six emotional states [5], [6]. Methods such as those applied in previous studies could also be used to detect emotional responses to user interfaces, providing a novel, objective measure of usability.

The goal of this paper was to identify a user's frustration in using a computer interface in order to objectively evaluate the user-friendliness of the interface. To do so, we first sought to distinguish between states of frustration and calm. We induced frustration by applying transformations to the user's inputs that made the interface more difficult to use. We acquired EEG data while subjects used the normal (i.e. calm) and transformed (frustrated) interfaces, and applied a machine-learning method to classify between the tasks. To confirm that our EEG data were picking up the emotional signals of frustration, we performed a functional magnetic resonance imaging (fMRI) experiment using the same tasks to ensure that brain regions corresponding to negative emotions were elicited.

2. METHODS

2.1. Participants
Three healthy individuals (two males and one female, 22-25 years old, right-handed) participated in the EEG experiment. One healthy male (50 years old) participated in the fMRI experiment. Written informed consent was obtained from all participants prior to the experiment. The experimental protocol was approved by the ethics committee of Tokyo Institute of Technology.

2.2. Experiment tasks
A red cursor, controlled with a trackball, was initially presented at the center of a computer screen. Participants were instructed to manipulate the trackball with their right hand to move the cursor to a target designated by a blue circle (Figure 1). Transformations were applied to the trackball output so that cursor movements differed from the trackball's movements. We expected these transformations to elicit frustration in participants as the interface became more difficult to use.

Figure 1. Positions of the 8 reaching targets (a) and the three periods in a trial: reaching-task period (b), "Still" period (c), and blinking period (d).

We applied two types of transformations: "Rotation" and "Acceleration". For Rotation tasks, a rotation around the origin was applied; when the trackball pointer was at (X, Y) and the rotation angle was θ, the cursor was shown at (X cos θ + Y sin θ, Y cos θ − X sin θ) (Figure 2a). For Acceleration tasks, the speed of the cursor increased with the distance between the trackball pointer and the origin. When the distance of the trackball pointer from the origin was

d = √(X² + Y²),    (1)

the cursor was shown at (KdX, KdY), where K is an acceleration coefficient (Figure 2b).

Figure 2. Axis transformation between mouse pointer and cursor. (a) "Rotation" transformation, (b) "Acceleration" transformation.

We used four transformations for the experiments: Rotation (θ = π/3), Rotation (θ = −π/2), Acceleration (K = 0.02), and Normal (no transformation). We chose these transformations specifically to elicit different emotional states in response to the user-unfriendly interface. Because rotation transformations proved easy to learn after a few sessions, we employed two Rotation tasks with different angles to avoid learning effects.
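For concreteness, the two transformations of Section 2.2 can be written directly from their definitions. The sketch below is our own minimal reimplementation, not the experiment software; function names and the coordinate convention (screen coordinates centered on the origin) are assumptions.

```python
import math

def rotate(x: float, y: float, theta: float) -> tuple[float, float]:
    """'Rotation': the cursor is shown at
    (X cos(theta) + Y sin(theta), Y cos(theta) - X sin(theta))."""
    return (x * math.cos(theta) + y * math.sin(theta),
            y * math.cos(theta) - x * math.sin(theta))

def accelerate(x: float, y: float, k: float = 0.02) -> tuple[float, float]:
    """'Acceleration': cursor speed grows with the distance d = sqrt(X^2 + Y^2)
    from the origin; the cursor is shown at (K*d*X, K*d*Y)."""
    d = math.hypot(x, y)
    return (k * d * x, k * d * y)

# The experimental conditions: Rotation (pi/3), Rotation (-pi/2), Acceleration (K=0.02).
print(rotate(100.0, 0.0, math.pi / 3))   # cursor displaced by 60 degrees
print(accelerate(100.0, 0.0))            # (200.0, 0.0): gain grows with distance
```

Note that with K = 0.02 the gain Kd exceeds 1 beyond d = 50 (in whatever screen units the experiment used), so the cursor increasingly outruns the trackball far from the origin, which is what makes stopping precisely on a target difficult (cf. Section 4.1).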
2.3. EEG experiment
EEG signals were recorded using 32 active electrodes (g.USBamp and g.LADYbird, g.tec medical engineering, Graz, Austria) placed according to the extended international 10-20 system (Figure 3). The sampling frequency was 256 Hz, and a 50 Hz notch filter was applied during recording.

Figure 3. EEG electrode positions.

Five sessions were performed for each transformation, resulting in 20 sessions per participant. Each session consisted of eight trials, one for each of the eight target positions in random order (Figure 1a). The cursor was initially placed at the origin (the red point in Figure 1b). After each reaching task was completed, a 2-second "Still" period immediately followed (Figure 1c). EEG signals during the Still period were used to decode the feelings elicited by the task while avoiding the effects of motion intention; we expected task-related emotional effects to persist during this period. During the Still period, participants were instructed to gaze at the same position until the reached target disappeared. A 3-second "Blink" period then followed (Figure 1d), during which participants were allowed to blink freely. We set the Blink period to prevent eye-blinking signals from contaminating the brain activity signals during the task period.

2.4. fMRI experiment
A 3.0-T Signa scanner equipped with a 9-channel array coil (General Electric Company, Fairfield, Connecticut) was used for the fMRI experiment. Functional data were acquired with an echo planar imaging sequence. A functional run consisted of 16 trials for the Normal condition, 8 trials for each of the two Rotation conditions, and 8 trials for the Acceleration condition. We acquired twice the number of Normal trials because the participant could perform the reaching task much faster in the Normal condition than in the other conditions. Five runs were performed. During a trial, a target position was randomly chosen from the eight points (Figure 1a), and all eight points were chosen twice (for the Normal condition) or once (for the other conditions) per run. The cursor was initially placed at the origin (Figure 1b). After each reaching task was completed, a Still (rest) period immediately followed (Figure 1c); its duration was randomly chosen from 1, 2, or 3 s. Since eye movements and blinking were not restricted in the fMRI experiment, trials did not have a Blink period.

2.5. EEG signal processing
EEG signals during the 2-s Still period were used for further analysis. We used a feature extraction method based on [11]. A 256-point FFT was applied to the EEG signals using a 250-ms window with a 125-ms window overlap. We then used the first 40 bins (1-40 Hz) of the mean spectrum as features for classification. This feature extraction was performed for signals from single electrodes as well as for differential signals between neighbouring electrode pairs.
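A minimal sketch of this feature extraction, reimplemented under stated assumptions: each 250-ms window is zero-padded to a 256-point FFT (which yields exactly 1 Hz per bin at 256 Hz sampling), the spectra are averaged across windows, and bins 1-40 form the trial's feature vector. Whether magnitude or power spectra were averaged is not stated in the paper; magnitude is assumed here, and all variable names are ours.

```python
import numpy as np

FS = 256                       # sampling frequency (Hz)
WIN = int(0.250 * FS)          # 250-ms window = 64 samples
STEP = WIN - int(0.125 * FS)   # 125-ms overlap -> 32-sample hop
NFFT = 256                     # 256-point FFT -> 1 Hz per bin at 256 Hz

def spectral_features(signal: np.ndarray) -> np.ndarray:
    """Mean magnitude spectrum over sliding windows; bins 1-40 = 1-40 Hz."""
    spectra = [np.abs(np.fft.rfft(signal[s:s + WIN], n=NFFT))   # zero-padded FFT
               for s in range(0, len(signal) - WIN + 1, STEP)]
    return np.mean(spectra, axis=0)[1:41]    # first 40 bins of the mean spectrum

# A trial's feature vector comes from the 2-s "Still" period; a differential
# channel is the sample-wise difference of two electrodes (e.g. AF3 - AF4).
rng = np.random.default_rng(0)
epoch_af3 = rng.standard_normal(2 * FS)      # placeholder data, not real EEG
epoch_af4 = rng.standard_normal(2 * FS)
features = spectral_features(epoch_af3 - epoch_af4)
print(features.shape)                        # (40,) -> one vector per trial
```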
2.6. Classification analysis using EEG signals
We conducted a classification analysis to determine whether emotional responses to the user-unfriendly interface could be identified from the EEG signals. Data for each trial were labelled according to their axis transformation, and the labels were predicted using a machine-learning technique. Since there were two types of Rotation trials, we pooled those trials and randomly eliminated half of them. Using a support vector machine [12], 10-fold cross-validation was performed to calculate classification accuracy. Specifically, with 40 trials acquired for each condition, 36 trials × 2 conditions (Normal vs. Rotation, or Normal vs. Acceleration) were used for training, and 4 trials × 2 conditions were used for verification. Following previous studies on negative emotions [10], [13]-[15], we used only frontal and parietal electrodes for classification. Net accuracy was calculated by averaging classification accuracy across iterations, and Welch's t-test was conducted on the results to test significance.
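Under the setup above, the classification step can be sketched as follows. scikit-learn's SVC (a wrapper around LIBSVM [12]) stands in for the paper's classifier; the kernel and its parameters are not reported, and the label-shuffled baseline against which Welch's t-test is run here is our own assumption for the reference distribution.

```python
import numpy as np
from scipy import stats
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder feature matrix: 40 Normal + 40 Acceleration trials,
# each a 40-bin spectral feature vector (see Section 2.5).
X = rng.standard_normal((80, 40))
y = np.array([0] * 40 + [1] * 40)            # 0 = Normal, 1 = Acceleration

def cv_accuracies(X, y):
    """10-fold CV: 36 + 36 trials for training, 4 + 4 for verification per fold."""
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    accs = []
    for train, test in cv.split(X, y):
        clf = SVC()                          # kernel/C unspecified in the paper
        clf.fit(X[train], y[train])
        accs.append(clf.score(X[test], y[test]))
    return np.array(accs)

accs = cv_accuracies(X, y)
net_accuracy = accs.mean()                   # "net accuracy" of Section 2.6

# Significance: Welch's t-test (unequal variances) of the per-fold accuracies
# against accuracies obtained with shuffled labels (our stand-in baseline).
accs_null = cv_accuracies(X, rng.permutation(y))
t, p = stats.ttest_ind(accs, accs_null, equal_var=False)
print(f"net accuracy = {net_accuracy:.3f}, p = {p:.3g}")
```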
2.7. fMRI image processing and general linear model analysis
fMRI data were pre-processed and analyzed using Statistical Parametric Mapping software, SPM12 (Wellcome Department of Cognitive Neurology, UK; available at http://www.fil.ion.ucl.ac.uk/spm). Pre-processing steps included slice-timing correction, motion correction, coregistration of the functional and anatomical images, normalization into standard Montreal Neurological Institute (MNI) space, and smoothing with an 8-mm Gaussian kernel. General linear model analysis was performed for three contrasts: (Rotations and Acceleration) > Normal, Rotations > Normal, and Acceleration > Normal.
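The analysis itself was run in SPM12; purely to illustrate the contrast logic in a runnable form, the sketch below sets up a comparable first-level GLM with nilearn. This is a substitution, not the authors' pipeline, and the file name, onsets, durations, and repetition time are hypothetical.

```python
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# One row per trial; trial_type carries the condition labels.
events = pd.DataFrame({
    "onset":      [0.0, 15.0, 32.0, 50.0],                  # hypothetical (s)
    "duration":   [12.0, 10.0, 14.0, 11.0],                 # reaching durations
    "trial_type": ["normal", "rotation", "acceleration", "normal"],
})

# The 8-mm smoothing follows Section 2.7; the repetition time is assumed.
model = FirstLevelModel(t_r=2.0, smoothing_fwhm=8.0)
model = model.fit("preprocessed_run.nii.gz", events=events)  # hypothetical file

# Two of the three contrasts of Section 2.7 as simple string expressions:
z_rotation = model.compute_contrast("rotation - normal")
z_acceleration = model.compute_contrast("acceleration - normal")
# The merged (Rotations + Acceleration) > Normal contrast corresponds to
# weights (+1, +1, -2) over the condition regressors; it can be passed as a
# vector aligned with the columns of model.design_matrices_[0].
```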
3. RESULTS

3.1. EEG classification results
Classification accuracies for Normal vs. Acceleration are shown in Figures 4a and 4b. For all participants, accuracies using the differential signal for parietal electrodes (C3 − Cz) were significantly higher than the chance level of 50 %. Feature vectors using C3 − Cz worked well for all participants (Participant 1: 76.3 % (p = 2.97 × 10⁻⁴), Participant 2: 86.3 % (p = 3.50 × 10⁻⁴), Participant 3: 80.0 % (p = 1.01 × 10⁻⁴)). The differential signal for frontal electrodes (AF3 − AF4) also showed significantly higher accuracy than chance level for all participants (Participant 1: 75.0 % (p = 4.39 × 10⁻⁵), Participant 2: 77.5 % (p = 8.66 × 10⁻⁵), Participant 3: 65.0 % (p = 0.0330)).

Figure 4. Classification accuracies using EEG signals. Binary classification results comparing (a) Acceleration and Normal conditions using differential signal C3 − Cz; (b) Acceleration and Normal using AF3 − AF4; (c) Rotation and Normal using C3 − Cz; (d) Rotation and Normal using AF3 − AF4. *p < 0.05, **p < 0.01, ***p < 0.001 by t-test.

Classification accuracies for Normal vs. Rotation are shown in Figures 4c and 4d. Participant 2 did not show significance for either differential signal (C3 − Cz: 61.3 % (p = 0.0967); AF3 − AF4: 50.0 % (p = 0.50)). The other participants also showed low classification accuracies, with the exception of Participant 3 for AF3 − AF4 (75.0 %, p = 1.96 × 10⁻⁴).

3.2. fMRI activation areas
Figure 5a shows three significant activation areas obtained with the contrast (Rotations and Acceleration) > Normal (p < 0.001, uncorrected for multiple comparisons). We expected this contrast to show the net effect of transformed cursor movement on brain activity. The largest cluster was located in the left middle frontal gyrus (MFG) (MNI coordinates: [−48, 44, 26], T = 3.61), and the second largest cluster was in the right lateral orbitofrontal cortex (LOFC) (MNI coordinates: [36, 54, −10], T = 3.53). The left inferior parietal lobe (IPL) also showed a small activated cluster (MNI coordinates: [−52, −36, 58], T = 3.13).

Looking at the Rotations > Normal and Acceleration > Normal contrasts (Figures 5b and 5c), the Rotation condition showed activated areas in the left IPL (MNI coordinates: [−46, 38, 30], T = 4.02, p < 0.001, uncorrected) and right MFG (MNI coordinates: [28, 6, 56], T = 3.24, p < 0.001, uncorrected). The Acceleration condition showed many activated areas, the largest cluster being located in the left and right IPLs (MNI coordinates: left, [−62, −36, 46], T = 4.63, p < 0.05, family-wise error corrected; right, [68, −24, 34], T = 4.42, p < 0.001, uncorrected). The right LOFC was the third largest activated area (MNI coordinates: [36, 54, −10], T = 4.15, p < 0.001, uncorrected).

Figure 5. Activation areas in general linear model analysis for contrasts (Rotations + Acceleration) > Normal (a), Rotation > Normal (b), and Acceleration > Normal (c). All activated areas shown are significant at a threshold of p < 0.001, uncorrected for multiple comparisons.

4. DISCUSSION
Our results showed that significantly high classification accuracies, especially for Acceleration vs. Normal classification, were obtained for all participants using only two EEG electrode signals, suggesting that brain activity varied enough across cursor movement transformations to be discriminated. Since the EEG data were not contaminated by electrooculogram or motion artifacts, and all eight target positions were mixed in the classification analysis, the variance in brain activity can be interpreted as a reflection of activity elicited by the cursor transformation.

4.1. Difference between Rotation and Acceleration
As shown in Figure 4, classification performance for Rotation (c and d) was lower than that for Acceleration (a and b), even though participants reported that Rotation was more frustrating than Acceleration. We attribute this to the timing at which the frustration occurred. Rotation caused difficulty at the initiation of movement, while Acceleration posed difficulty towards the end of the task period, because it was hard to stop the cursor exactly on the target. We used EEG data from the period immediately after the reaching tasks were completed to evaluate frustration, to ensure that the data did not contain other effects such as motion and movement directions. Thus, it is possible that the analyzed period contained more of the frustration elicited by Acceleration than by Rotation.

4.2. Validity of the electrode positions providing high accuracies
Under the Pleasure-Arousal-Dominance (PAD) model [16] and its two-dimensional variant [17], [18], which employs arousal (intense – calm) and valence (positive – negative) as its orthogonal axes, emotions related to the user-unfriendliness of interfaces can be considered as high arousal and low valence. In Acceleration vs. Normal (Figure 4a), C3 − Cz showed high accuracies for all participants. This is comparable with results from previous studies that showed a relation between unpleasant image stimulation and parietal regions [14], [15]. Electrodes in frontal regions, AF3 − AF4, were also found to be effective for classification (Figure 4b). Kostyunina and Kulikov [14] reported that alpha wave power significantly increased at F3, T4, and O1 when feeling anger, which is considered a stronger emotion than annoyance or irritation [19]. Disgust, which also has high arousal and low valence, can likewise be distinguished from a calm state using right frontal electrodes [10]. This may be why AF3 − AF4 showed relatively high accuracy in our results: the differential signal enhanced the activity difference between the right and left hemispheres.

4.3. Comparison of fMRI and EEG results
Although fMRI was performed for only one participant, we believe the data, as they accord with existing findings, provide some insight into the emotional responses elicited by the tasks. Even though we could not obtain high classification accuracies for the Rotation vs. Normal condition, possibly due to the time lag between the task period and the rest period (i.e. the data used for analysis), the fMRI analysis examined whether negative emotions were elicited by both cursor transformations. Our fMRI analysis showed activation mainly in the left and right MFGs, the right LOFC, and the left and right IPLs. The left and right MFGs, which were mainly activated in the Rotation condition, are associated with frustration [8] as well as with the negative feelings of sadness and anger [7]. The right LOFC is associated with negative feelings elicited by observing angry faces [20] and by stress induction [21], as well as with regulation of negative emotions [22]. The IPL relates to various cognitive functions, including attention [23], language [24], action processing, and emotional action observation [25]-[28]. Considering our experimental tasks, emotional action observation is unlikely to have been the cause. Rather, the IPL activation might have been due to reorienting and spatial attention [23], [29], which are motor planning and action-related functions [30]. Controlling the cursor under transformed motion required motor planning for the entire duration of the task. Furthermore, IPL activation was lower in the merged contrast (Rotation and Acceleration) > Normal than in the other contrasts.
This may have occurred because the coordinates of the activated voxels in the IPL were not exactly the same between Acceleration and Rotation, suggesting that different strategies were employed for reorienting or motor planning.

Comparing the relevant activation areas in fMRI with the EEG electrode positions used for classification, the differential signal AF3 − AF4 may have captured brain activity around the right and left MFG and the right LOFC, and C3 − Cz may have captured activity around the right and left IPL. To prioritize activity related to emotional response, it might be better to focus on AF3 − AF4 differential activity in future work. Further investigation with an increased sample size and EEG source localization analysis is also needed before more definitive interpretations can be made.

5. CONCLUSIONS
In this study, we used EEG to evaluate the usability of a human interface. We employed target-reaching tasks with transformations applied to cursor motion to introduce user-unfriendliness. FFT-based feature extraction and support vector machine classification revealed that two electrode signals from frontal regions were sufficiently effective in discriminating between user-friendly and user-unfriendly conditions. An fMRI experiment using the same tasks further revealed activation in the left MFG and right LOFC, areas previously reported as relevant to negative emotions. These results support our future plans to develop an interface with the ability to adapt its usability based on the user's emotional response.

REFERENCES
[1] P. Rani, N. Sarkar, C.A. Smith, and J.A. Adams: ‘Affective communication for implicit human-machine interaction’, 2003 IEEE International Conference on Systems, Man and Cybernetics, Vols 1-5, Conference Proceedings, (2003) pp. 4896-4903.
[2] S. Jerritta, M. Murugappan, R. Nagarajan, and K. Wan: ‘Physiological signals based human emotion recognition: a review’, 2011 IEEE 7th International Colloquium on Signal Processing and its Applications, IEEE, (2011) pp. 410-415.
[3] M. Li, and B.L. Lu: ‘Emotion Classification Based on Gamma-band EEG’, 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vols 1-20, (2009) pp. 1323-1326.
[4] C.M. Pawliczek, B. Derntl, T. Kellermann, R.C. Gur, F. Schneider, and U. Habel: ‘Anger under control: neural correlates of frustration as a function of trait aggression’, PLoS ONE, 8, (10), (2013) e78503.
[5] P.C. Petrantonakis, and L.J. Hadjileontiadis: ‘A Novel Emotion Elicitation Index Using Frontal Brain Asymmetry for Enhanced EEG-Based Emotion Recognition’, IEEE T Inf Technol B, 15, (5), (2011) pp. 737-746.
[6] P.C. Petrantonakis, and L.J. Hadjileontiadis: ‘Adaptive Emotional Information Retrieval From EEG Signals in the Time-Frequency Domain’, IEEE T Signal Proces, 60, (5), (2012) pp. 2604-2616.
[7] K. Vytal, and S. Hamann: ‘Neuroimaging support for discrete neural correlates of basic emotions: a voxel-based meta-analysis’, J Cogn Neurosci, 22, (12), (2010) pp. 2864-2885.
[8] R. Yu, D. Mobbs, B. Seymour, J.B. Rowe, and A.J. Calder: ‘The neural signature of escalating frustration in humans’, Cortex, 54, (2014) pp. 165-178.
[9] S.G. Mason, A. Bashashati, M. Fatourechi, K.F. Navarro, and G.E. Birch: ‘A comprehensive survey of brain interface technology designs’, Ann Biomed Eng, 35, (2), (2007) pp. 137-169.
[10] R.J. Davidson, C.D. Saron, J.A. Senulis, P. Ekman, and W.V. Friesen: ‘Approach-Withdrawal and Cerebral Asymmetry: Emotional Expression and Brain Physiology I’, J Pers Soc Psychol, 58, (2), (1990) pp. 330-341.
[11] O. AlZoubi, R.A. Calvo, and R.H. Stevens: ‘Classification of EEG for Affect Recognition: An Adaptive Approach’, Lect Notes Artif Int, 5866, (2009) pp. 52-61.
[12] C.C. Chang, and C.J. Lin: ‘LIBSVM: A Library for Support Vector Machines’, ACM T Intel Syst Tec, 2, (3), (2011).
[13] B. Guntekin, and E. Basar: ‘Event-related beta oscillations are affected by emotional eliciting stimuli’, Neurosci Lett, 483, (3), (2010) pp. 173-178.
[14] M.B. Kostyunina, and M.A. Kulikov: ‘Frequency characteristics of EEG spectra in the emotions’, Neurosci Behav Physiol, 26, (4), (1996) pp. 340-343.
[15] N. Martini, D. Menicucci, L. Sebastiani, R. Bedini, A. Pingitore, N. Vanello, M. Milanesi, L. Landini, and A. Gemignani: ‘The dynamics of EEG gamma responses to unpleasant visual stimuli: From local activity to functional connectivity’, NeuroImage, 60, (2), (2012) pp. 922-932.
[16] A. Mehrabian: ‘Pleasure arousal dominance: A general framework for describing and measuring individual differences in temperament’, Curr Psychol, 14, (4), (1996) pp. 261-292.
[17] T. Eerola, and J.K. Vuoskoski: ‘A comparison of the discrete and dimensional models of emotion in music’, Psychology of Music, 39, (1), (2010) pp. 18-49.
[18] J.A. Russell: ‘Affective Space Is Bipolar’, J Pers Soc Psychol, 37, (3), (1979) pp. 345-356.
[19] J.R. Averill: ‘Studies on Anger and Aggression: Implications for Theories of Emotion’, Am Psychol, 38, (11), (1983) pp. 1145-1160.
[20] R. Elliott, R.J. Dolan, and C.D. Frith: ‘Dissociable functions in the medial and lateral orbitofrontal cortex: evidence from human neuroimaging studies’, Cereb Cortex, 10, (3), (2000) pp. 308-317.
[21] N.Y. Oei, I.M. Veer, O.T. Wolf, P. Spinhoven, S.A. Rombouts, and B.M. Elzinga: ‘Stress shifts brain activation towards ventral “affective” areas during emotional distraction’, Soc Cogn Affect Neurosci, 7, (4), (2012) pp. 403-412.
[22] A. Golkar, T.B. Lonsdorf, A. Olsson, K.M. Lindstrom, J. Berrebi, P. Fransson, M. Schalling, M. Ingvar, and A. Ohman: ‘Distinct contributions of the dorsolateral prefrontal and orbitofrontal cortex during emotion regulation’, PLoS ONE, 7, (11), (2012) e48107.
[23] G.R. Fink, J.C. Marshall, P.H. Weiss, and K. Zilles: ‘The neural basis of vertical and horizontal line bisection judgments: an fMRI study of normal volunteers’, NeuroImage, 14, (1 Pt 2), (2001) pp. S59-67.
[24] M. Vigneau, V. Beaucousin, P.Y. Herve, H. Duffau, F. Crivello, O. Houde, B. Mazoyer, and N. Tzourio-Mazoyer: ‘Meta-analyzing left hemisphere language areas: phonology, semantics, and sentence processing’, NeuroImage, 30, (4), (2006) pp. 1414-1432.
[25] B. de Gelder, J. Snyder, D. Greve, G. Gerard, and N. Hadjikhani: ‘Fear fosters flight: a mechanism for fear contagion when perceiving emotion expressed by a whole body’, Proc Natl Acad Sci U S A, 101, (47), (2004) pp. 16701-16706.
[26] H. Goldberg, A. Christensen, T. Flash, M.A. Giese, and R. Malach: ‘Brain activity correlates with emotional perception induced by dynamic avatars’, NeuroImage, 122, (2015) pp. 306-317.
[27] J. Grezes, S. Pichon, and B. de Gelder: ‘Perceiving fear in dynamic body expressions’, NeuroImage, 35, (2), (2007) pp. 959-967.
[28] S. Pichon, B. de Gelder, and J. Grezes: ‘Emotional modulation of visual and motor areas by dynamic body expressions of anger’, Soc Neurosci, 3, (3-4), (2008) pp. 199-212.
[29] M. Corbetta, G. Patel, and G.L. Shulman: ‘The reorienting system of the human brain: from environment to theory of mind’, Neuron, 58, (3), (2008) pp. 306-324.
[30] S. Caspers, K. Zilles, A.R. Laird, and S.B. Eickhoff: ‘ALE meta-analysis of action observation and imitation in the human brain’, NeuroImage, 50, (3), (2010) pp. 1148-1167.