Cursor Movement – a Valuable Indicator in Intelligent System Design

Versavia-Maria Ancusa
Computer and Information Technology Department, “Politehnica” University of Timisoara, Romania
versavia.ancusa@cs.upt.ro

Ciprian-Maniu Dragoe
Freelancer, Sibiu, Romania
ciprian.dragoe@gmail.com

Abstract
Systems that react to emotional information allow for better satisfaction of the user’s needs, stated or otherwise. Special support should be built in, in order to read and measure the user’s time-varying affective state. This paper shows how cursor movement can accurately measure two basic emotional states and introduces a way to measure the emotional flow graph of an application, which allows for better user design.

Keywords: emotional state, cursor movement, hypnosis, complex network, design

1. Introduction
Digital devices are all around us in increasingly large numbers, exploiting a multitude of available data; nonetheless, the user’s emotional state when using them is often ignored or treated as a dispensable input. However, researchers have argued for more than ten years that technology efficiency is strongly interlinked with triggers in the human’s emotional state and behavior (Tao & Tan, 2005). The whole argument is intrinsically placed in an interdisciplinary setting, linking psychology, medicine, artificial intelligence, education, neuroscience, cognitive science, design and whatever engineering fields are needed to implement the solution, therefore criss-crossing “technical, material and social systems” (Mazé & Redström, 2008).

When creating a new product or service, such as a learning service, the whole design process is a vehicle to create or satisfy needs (Mazé & Redström, 2008), and various branches of science are involved (Figure 1), depending on the desired depth in approaching the human factor. That factor can encompass needs, habits, actions and behaviors, all reflected in emotional states (Wood, Quinn, & Kashy, 2002; Baumeister, DeWall, Vohs, & Alquist, 2009) when the product in question is used. While there is an obvious “intrinsic and mutual correlation between the product design and user activities” (Lockton, Harrison, & Stanton, 2010), the more challenging insight is that emotional information is reflected in the whole psychological ensemble leading to needs (Esposito, Esposito, & Vogeld, 2015), thereby creating a more consistent working framework and an easily understandable, homogeneous design approach. Although design itself is considered to influence user behavior by presenting users with an opportunity to play along, one line of work (Lockton, Harrison, & Stanton, 2010) argues that there are “systems intentionally designed to influence behavior differently from that usually associated with the situation, or in situations where a user would not otherwise have a strong idea of what to do (e.g. with an unfamiliar interface)”. In such systems the designer’s intent interacts at product level with the user’s context (experiences, environment, and capabilities), which, due to the sheer volume of possibilities, may lead to very varied responses.
A solution is to target universal experiences (Lockton, Harrison, & Stanton, 2010) in order to reduce the response variability; however, the amount of available data suggests a possible data mining approach. Emotion data mining emerged only recently (Borras-Morell, 2015; Reagan, Mitchell, Kiley, Danforth, & Dodds, 2016) and is not yet a product design tactic.

Figure 1. Science branches involved in translating product features to human needs

Approach notwithstanding, designing product features basically involves a two-step process: (a) detecting and recognizing emotional information and (b) exhibiting a suitable reaction to that input. Since needs are reflected in the emotional impact, a method to measure emotions is necessary in order to assess that impact correctly (Abraham & Michie, 2008). Referring strictly to a software product, the first step in implementing the capability to sense the users’ emotional state should be developing an affective database (Tao & Tan, 2005) that allows correct identification of the affective status. This yields a translation between the affective status of the user and the computer, allowing the program to process the user’s emotion just like any other input, successfully “digitizing” emotion. Once this milestone is achieved, research can focus on developing affective characteristics for machines, designing evolving algorithms to promote emotional adjustment, and so on. In other words, this is the moment at which the computer exhibits an output that humans can interpret as an emotional reaction, leading to another emotional reaction from the user and entering a feedback loop. Humans need multi-modal communication in order to recognize the “honest feeling” (Ekman & Friesen, 1969); therefore the machine-generated response must be exhibited on more than one communication channel.

The idea of multi-modal communication can also be found in (Lockton, Harrison, & Stanton, 2010), although from a design perspective, where an approach consisting of six design lenses is of particular interest, especially three of them: (1) the persuasive lens, which represents the link to persuasive technology, “changing attitudes and so changing behavior through contextual information, advice and guidance”; (2) the visual lens, referring to “how users perceive patterns and meanings as they interact with the systems around them, and the use of metaphors”; and (3) the cognitive lens, or “how people make decisions”. Each of these lenses has different patterns, with each pattern underpinned by a pattern behavior, usage outcome or final user-behavior state. (Lockton, Harrison, & Stanton, 2010) suggests that the simultaneous application of two or more design patterns to obtain a target behavior, for example combining the visual and persuasive lenses, is the more advantageous option. From a data mining point of view, the idea of patterns creates a different outlook, which in our opinion should be used in conjunction with the design perspective.
This paper presents a method to read user behavior in a working computer environment, creating reliable means of acquiring and identifying emotions, as well as measures to assess the overall emotional impact and plan design improvements.

2. Measuring emotions
An emotional state in its pure form is very hard to trigger in a test subject, as it is usually mixed with other emotional states. However, two statements can be made about such a state: (1) an emotion is the result of an emotional state, representing a cognitive response to it, and (2) an emotional state depends on the previous emotional state(s) as well as on the person’s reception of outside sensory inputs (Davidson, et al., 2003). Each emotion can be reduced to a mix of basic emotional states (Ekman & Friesen, 1969), but in order to measure it successfully, we need to determine the type, number and measuring characteristics of those basic states.

2.1. Determining the basic emotional states model
The concept of basic emotional states, although it seems simple, is the subject of heated debate in psychology, and no consensus has been reached regarding their number, designation and description, all of which depend on the model used (Ekman & Friesen, 1969). A very practical solution is to find the most common basic emotional states and use them as baselines, similar to the scheme used in (Hoque & Picard, March 2011): “to simplify the classification and to establish a common benchmark, there has been a trend to use and analyze basic emotional states”.

For this paper, we chose two basic emotional states, relaxation and stress, found in almost all psychological models. Our choice is further motivated by several factors: (a) the emotional distance between these states, allowing for a good margin of error; (b) their importance in the product design process (Michie, van Stralen, & West, 2011); and (c) the commonality of the triggers used to induce them.

2.2. Isolating an emotional state
Another milestone in measuring emotions is determining the parameters that trigger an emotional state; not all of them are known, and they often vary from person to person. An example of this problem is putting two individuals in the context of delivering a public speech: one of them might feel comfortable or even happy, while the other might feel stressed. The same stimulus produces a different reaction in each individual, making it more difficult to create a controlled experiment that produces the same result in every person.

The solution presented in this paper is to use hypnosis. Although shrouded in controversy and uncertainty, hypnosis can be practiced within a rigorous scientific environment. The most widespread definition of hypnosis is the one issued by the British Medical Association (British Medical Association, 1955), which describes it as a “temporary condition of altered perception in the subject which may be induced by another person and in which a variety of phenomena may appear spontaneously or in response to verbal or other stimuli. These phenomena include alterations in consciousness and memory, increased susceptibility to suggestion, and the production in the subject of responses and ideas unfamiliar to him in his normal state of mind.
Further phenomena such as anesthesia, paralysis and the rigidity of muscles, and vasomotor changes can be produced and removed in the hypnotic state.” According to (Burrows, Stanley, & Bloom, 2001), when a subject is under hypnosis his awareness is fixated on an interior cognitive sentience, which in turn reduces the focus on external sources of information. The central nervous system’s responses are attenuated, as the external environment’s indicators are mainly ignored.

Two ideas from the previous paragraph should be highlighted: altered perception and narrow focus. If the stimuli that trigger that perception are well applied, the subject’s perception can be oriented in such a manner as to produce an emotional state. That state can be refined to its purest form, because a subject under hypnosis can be made to easily ignore unwanted stimuli, thus obtaining a very high purity of the triggered emotional state. The subject’s own subconscious is responsible for clearing the mind and triggering the emotional state, thereby achieving a personally customized, pure version of the emotional state.

2.3. Quantifying an emotional state
The intensity of an emotional state is best reflected in the way the body reacts (Ekman & Friesen, 1969), as the body bypasses social norms and echoes the mental changes. Table 1 compares existing methods used to measure and quantify emotional states.

Table 1. Method comparison

Purpose: Facial expression recognition
  Device: video capture device
  Data processing: minimum a 2-step process: overlay of mesh or muscle model; Gabor wavelets, Markov models
  Pitfalls: invalid in certain cases (Botox, paralysis, etc.)
  Papers: (McKeown, Valstar, Cowie, Pantic, & Schröder, 2012), (Eckhardt & Goodwin, 2012), (McDuff, Kaliouby, & Picard, Nov. 2011), (Hoque, Morency, & Picard, Oct. 2011)

Purpose: Gesture processing
  Device: video capture and/or motion sensors
  Data processing: cluster recognition, based on a built database
  Pitfalls: long observation time; certain impaired conditions; calibrated on culture-specific social norms
  Papers: (Balomenos, et al., 2005), (Baltrusaitis, et al., March 2011)

Purpose: Full-body motion translation
  Device: array of motion sensors, specific measuring space
  Data processing: complex geometrical, time-variable processing
  Pitfalls: some body parts (hands) can be controlled more than others (feet); some transitions are too fast to be transformed into movement
  Papers: (Ahn, Teeters, Wang, Breazeal, & Picard, Sept 2007), (Gao, Ma, Chen, & Wu, 2005)

Purpose: Emotional speech processing
  Device: camera, high-quality microphone, synchronization between devices
  Data processing: filtering and segmentation; various types of probability distribution
  Pitfalls: language-specific elements; certain disorders (e.g. autism) change the patterns
  Papers: (Hoque, Oct. 2008), (Hernandez, Morris, & Picard, Oct. 2011), (Callejas, Griol, & López-Cózar, 2011), (Morris & Picard, May 2011)

Purpose: Whole emotional state
  Device: various, synchronized to work together (e.g. webcam = video + audio; motion sensors, wearable sensors, cell phone)
  Data processing: data mining; statistical or geometrical processing
  Pitfalls: too many parameters; (usually) expensive equipment; very difficult to find a good reference model
  Papers: (Poh, McDuff, & Picard, 2011), (Hussein, Monkaresi, & Calvo, 2012), (Kapoor & Picard, November 2005)
As can be seen, all these methods generally require extra hardware as well as moderate to advanced data processing techniques. While indubitably accurate, such methods are not the easy, seamless interaction medium that advanced, emotionally reactive intelligent systems require, especially in a world in which mobile computing assumes an increasingly large market share. Today, most interaction happens “at the click of the mouse”, which led us to look for a way to quantify emotional states through mouse clicks, or screen taps for mobile users. If proven to work, it would be an extremely interesting way to measure emotions: it is built in (no extra hardware needed), ubiquitous, and quick to process.

3. Experimental results
The purpose of our experiments was to determine whether mouse usage patterns can serve as an accurate emotional sensor for two basic emotional states: relaxation and stress.

We selected a group of 77 volunteers with similar academic backgrounds and comparable computer knowledge. The male/female ratio was almost equal. None of the subjects had any prior knowledge of hypnosis or any prior expectations. The experiments were conducted after the presentation and signing of an informed consent form, scripted on the basis of the Stanford Informed Consent Form (Stanford University). We maintained a constant testing environment and tried to dilute the effect of outside light on mood by keeping the blinds closed.

Hypnosis was used to isolate each of the selected emotional states. It was conducted according to the Stanford protocol for hypnosis (Stanford University), which is the most referenced protocol and offers the most reproducible results. To conduct the experiments safely, the hypnosis was induced by C. Dragoe, a fully certified NLP practitioner. Moreover, all experiments were carefully monitored by an external observer (V. Ancusa) and by filming the subjects, with their previously given consent and knowledge, in order to show the participants that nothing nefarious was done to them.

In order to capture the emotional states, we synchronized the computer program and the users and, using only the mouse, in a single-modal manner, we captured the users’ reactions to the induced states. The course of action adopted for the experiment can be summarized in four clear steps:
1. Present the context and scope of the experiment to the test subjects.
2. Use a hypnotic script to elicit the emotional state.
3. Have the subject interact with the computer through the mouse, and capture this interaction using a dedicated program that simulates a simple, repeated choice activity.
4. Use a break state to change the emotional state of the test subjects, so as not to affect their future actions.

The application that measures the way the user interacts with the PC mouse simulates a simple strategy game. The goal is to pick a particular object out of several objects arranged in a random pattern. The learning curve of the game is designed to be very fast, so that the subject quickly understands all the principles and the results collected during the test are not influenced by accommodation to the mechanics. Each of the 77 individual responses was evaluated using the mouse acceleration (reaction) and the number of mouse clicks as indicators of the subject’s emotional state.
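To make these two indicators concrete, the sketch below shows one way to derive them from sampled cursor positions. It is our own illustration under stated assumptions (uniform sampling, the function and field names are ours), not the acquisition program actually used in the experiment.

```python
# Illustrative sketch (not the experiment's actual code): derive the two
# indicators, acceleration and click count, from sampled cursor positions.
import numpy as np

def indicators(t, x, y, n_clicks):
    """t: timestamps (s); x, y: cursor coordinates (px); n_clicks: clicks in the trial."""
    t, x, y = map(np.asarray, (t, x, y))
    dt = np.diff(t)
    speed = np.hypot(np.diff(x), np.diff(y)) / dt  # px/s between samples
    accel = np.diff(speed) / dt[1:]                # px/s^2
    return {"median_abs_accel": float(np.median(np.abs(accel))),
            "clicks": int(n_clicks)}

# Synthetic trial: a steady horizontal sweep with a slight vertical wobble.
t = np.linspace(0.0, 2.0, 50)
x = 100 + 300 * t
y = 200 + 20 * np.sin(5 * t)
print(indicators(t, x, y, n_clicks=7))
```

In the paper’s terms, a higher median acceleration would lean toward stress, while a higher click count would lean toward relaxation.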
Simply plotting acceleration over time provides little information; the reaction seems very chaotic and somewhat similar between the two states (Figure 2). However, when looking at the median values of acceleration and clicks, a clear difference appears between the states: more clicks for relaxation, higher acceleration for stress (Figure 3).

Figure 2. Detailed view of the individual response

Figure 3. Average values in acceleration and clicks

The experiment shows a fundamental difference in cognitive mode: relaxation allows more targets to be hit, albeit reached slowly, while stress favors reaching the target faster, but not more accurately. Referring back to Table 1, we managed to determine whole emotional states using a single cheap capture device with minimal data processing, although so far the method can determine only two states.

4. Designing for emotional reactions
The whole purpose of the previous test was to determine whether we can identify emotional states easily, with no added hardware costs, and we proved that there is a way to do so. Next, this knowledge is applied to software interface design.

Studies from different fields (Arroyo & Wei, 2006; Mäkiaho & Poranen, 2012; Lockton, Harrison, Cain, Stanton, & Jennings, 2013; Seelye, et al., 2015; Hehman, Stolier, & Freeman, 2014) have shown that mouse trajectory and clicks are environment- (application, device) and user-specific, especially when mapped over a long working session (Figure 4).

Figure 4. Mouse trajectory – images taken from http://iographica.com; 4 hours in Photoshop (left) vs. 4 hours in Eclipse (right)

Seen from a complex network perspective, these types of traces show good potential for clustering, which, when combined with clicks and acceleration, should provide a hint about the user’s emotional state while using the program, invaluable for good interface design. To achieve that, we propose the following workflow:

Step 1: Measure user interaction with location, time and click details. Record the position and constituents of menus and toolbars (especially if dynamic).
Step 2: Determine the user’s baseline, maximum and minimum reactions; categorize events; select the relevant interface items.
Step 3: Construct a geographical interaction network based on the previous measurements and compute spatial flow data with hierarchical clustering, using the technique from (Zhu & Guo, 2014). Color the clusters differently for relaxation and stress.
Step 4: Analyze the results for the given environment and user.

Step 2 is where the previously presented experiments make a difference. Since we proved that stress and relaxation are opposites when measured in clicks and acceleration, and taking into account that normal users experience a mix of these two states, we assume that over a long period of time the mix evens out. It is very hard to remain completely relaxed or completely stressed for a long period (more than 2-4 hours), and even if one state lasts longer than the other, since neither is “pure” (no hypnosis is used), using the average as a baseline makes sense. In the worst case the baseline would be skewed, but even then Step 3 will show the data flow correctly, with a bias on the transitions, which is still acceptable for analyzing the workflow. A concrete illustration of this baseline computation is sketched below.
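The following is a minimal sketch of the Step 2 baseline computation, assuming per-interval click rates and accelerations collected over a long session. The exact scaling onto a relaxation/stress axis, and all names, are our assumptions rather than a prescribed implementation.

```python
# Sketch of Step 2 (assumed details): session averages serve as the baseline,
# observed extremes bound a linear scale, and each interval lands on a
# relaxation(-)..stress(+) axis.
import numpy as np

def emotional_scale(clicks_per_min, accel):
    clicks = np.asarray(clicks_per_min, dtype=float)
    accel = np.asarray(accel, dtype=float)
    # Baseline: long-session average (assumes the relaxed/stressed mix evens out).
    c = (clicks - clicks.mean()) / (clicks.max() - clicks.min())
    a = (accel - accel.mean()) / (accel.max() - accel.min())
    # More clicks pull toward relaxation (negative); higher acceleration toward stress.
    return a - c

scores = emotional_scale(clicks_per_min=[12, 9, 4, 15, 3],
                         accel=[220, 310, 540, 180, 600])
print(np.round(scores, 2))  # > 0 leans stressed, < 0 leans relaxed
```

Note that a skewed baseline shifts all scores by a constant, which is exactly the acceptable bias on the transitions mentioned above.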
In summary, in Step 2 we average the clicks and the accelerations to determine the baselines, determine the maximum and minimum values, and scale everything linearly, using stress and relaxation as the two opposite ends of the scale.

Step 3 is the most challenging from a computer science point of view, as it requires merging a flow clustering algorithm with the previous node classification, as well as image recognition for toolbar items in order to cut out the desired elements. This step currently requires a lot of programmer interaction and will need automation in the future. Moreover, after testing the workflow, we reached the conclusion that dynamic (not always present) toolbars can generate many errors: because they sometimes float, they are recognized as two or more distinct clusters instead of a single one. This issue will also need tweaking in the future. (A simplified sketch of this clustering step is given at the end of this section.)

It is worth mentioning that the presented workflow fits the general big data analysis model (Fisher, DeLine, Czerwinski, & Drucke, 2012): Step 1 is data gathering, Step 2 data cleaning, Step 3 processing, and Step 4 data analysis.

To test this strategy, we used it to analyze a designer’s workflow in a graphical editing program for 3 hours. The results are presented in Figure 5. We also recorded the clicks with IOGraph for easier reproduction of the results; that recording is the image in the top left corner. The interface of the program is in the upper right corner, while the result of Step 3 is shown in the bottom image. Based on this result, the non-automated part, Step 4, begins with analyzing the data and classifying it as stressed/relaxed. The most intense work area was near the middle of the screen, and that cluster also displays the highest stress markers (cluster A). Two other areas of intense work were B and C, which are more relaxed and peripherally located. The most used toolbar items were the selection and shape tools, color selection, the layer toolbar and the temporary (dynamic) align toolbar. Of all the user interface items, the one that shows the highest stress markers is the align toolbar. When we asked our subject why he thinks that happened, he admitted that the program he had learned graphical editing on had a right-side floating toolbar, much easier to use than the one in this program, and that he had always enjoyed that one more. Another interesting question is why, of the three work areas, two show relaxation markers while the third does not. The answer lies in the nature of the work: graphical design, which engages the right side of the brain, gives more weight to peripheral cues, while the analytical side favors central images. By replaying the work process, we could see that zones B and C involved more creative acts (shape creation, color selection) than zone A, which involved more analytical acts (aligning, layer cleaning, etc.).

Figure 5. Emotional flow after 3 hours of graphical editing

To summarize, we started with a product interface, found a way to determine two opposite states, then used it to map user interaction with the product and determine the emotional response to that interface. The results can then be used to improve the product design, to elicit certain emotional responses, and so on.
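As promised above, here is a simplified sketch of the Step 3 clustering. The workflow relies on the spatial-flow hierarchical clustering of (Zhu & Guo, 2014); the generic SciPy agglomerative clustering below is only a stand-in for illustration, and the data is synthetic.

```python
# Stand-in sketch of Step 3: cluster click locations into screen work areas
# and label each cluster by its mean Step 2 stress score. Not the spatial-flow
# method of Zhu & Guo (2014), just a generic approximation on synthetic data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Synthetic click log: columns are x, y and the Step 2 stress score.
clicks = np.vstack([
    rng.normal([960, 540, 0.6], [40, 40, 0.1], (60, 3)),    # central, stressed
    rng.normal([200, 300, -0.4], [30, 30, 0.1], (40, 3)),   # peripheral, relaxed
    rng.normal([1700, 800, -0.3], [30, 30, 0.1], (40, 3)),  # peripheral, relaxed
])
xy, stress = clicks[:, :2], clicks[:, 2]

# Ward-linkage agglomerative clustering on click positions, cut at 3 clusters.
labels = fcluster(linkage(xy, method="ward"), t=3, criterion="maxclust")
for k in sorted(set(labels)):
    m = labels == k
    cx, cy = xy[m].mean(axis=0)
    print(f"cluster {k}: center=({cx:.0f}, {cy:.0f}), "
          f"mean stress={stress[m].mean():+.2f}, n={m.sum()}")
```

In the case-study terms, per-cluster labels of this kind are what distinguish the stressed central cluster (A) from the relaxed peripheral ones (B and C).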
For example, in this case, given the rapid-movement (stress) patterns in the aligning phase, a layout that minimizes them can be developed, using a shortcut menu or dynamically appearing guidelines. Furthermore, the user interface could learn working patterns and shift the shortcut menu from an aligning menu, when it detects stress patterns, to a color/shape picking menu, when it detects relaxation patterns, thereby assessing the emotional impact and improving the design of the interface at the same time.

5. Conclusion and future work
Future work will focus on creating a computer interface that elicits emotional responses via multi-modal visual-kinesthetic cues, and on customizing the isolation of the emotional state, adding more emotional states and comparing their baselines. It would also be interesting to inject some diversity (age, background, perhaps even cognitive impairment) into the testing group and try to determine those subjects’ emotional states as well. The emotional flow idea can be used by game makers to create more immersive games, even with only the two emotional states evaluated here.

References
Abraham, C., & Michie, S. (2008). A taxonomy of behavior change techniques used in interventions: Categorizing intervention content.
Ahn, H., Teeters, A., Wang, A., Breazeal, C., & Picard, R. (Sept 2007). Stoop to Conquer: Posture and affect interact to influence computer users' persistence. 2nd International Conference on Affective Computing and Intelligent Interaction. Lisbon, Portugal.
Arroyo, E. S., & Wei, W. (2006). Teaching User Interface Design using a Web-based Usability Tool. Proceedings of the Conference on Human Factors in Computing Systems CHI. Montreal, Canada: ACM. doi:10.1.1.126.5970.
Balomenos, T., Raouzaiou, A., Ioannou, S., Drosopoulos, A., Karpouzis, K., & Kollias, S. (2005). Emotion Analysis in Man-Machine Interaction Systems. In S. Bengio & H. Bourlard (Eds.), MLMI 2004, LNCS 3361 (pp. 318-328). Heidelberg: Springer-Verlag Berlin.
Baltrusaitis, T., McDuff, D., Banda, N., Mahmoud, M., Kaliouby, R., Robinson, P., & Picard, R. (March 2011). Real-time inference of mental states from facial expressions and upper body gestures. 9th IEEE International Conference on Automatic Face and Gesture Recognition and Workshops (FG'11). Santa Barbara, CA, USA.
Baumeister, R. F., DeWall, C. N., Vohs, K. D., & Alquist, J. L. (2009). Does Emotion Cause Behavior (Apart from Making People Do Stupid, Destructive Things)? In C. R. Agnew, D. E. Carlston, W. G. Graziano, & J. R. Kelly (Eds.), Then A Miracle Occurs: Focusing on Behavior in Social Psychological Theory and Research (Chapter 7). New York: Oxford University Press. doi:10.1093/acprof:oso/9780195377798.003.0007.
Borras-Morell, E. J. (2015). Data mining for pulsing the emotion on the web. Methods Mol Biol., 1246, 123-130. doi:10.1007/978-1-4939-1985-7_8.
British Medical Association. (1955). Medical use of hypnotism. British Medical Journal, 1.
Burrows, G. D., Stanley, R. O., & Bloom, P. B. (Eds.). (2001). International Handbook of Hypnosis. Sussex, UK: Wiley. doi:10.1002/0470846402.
Callejas, Z., Griol, D., & López-Cózar, R. (2011). Predicting user mental states in spoken dialogue systems. EURASIP Journal on Advances in Signal Processing, 6, 1-23.
Davidson, K. W., Goldstein, M., Kaplan, R. M., Kaufmann, P. G., Knatterund, G. L., Orleans, C. T., ... Whitlock, E. P. (2003). Evidence-based behavioral medicine: What is it and how do we achieve it?
Annals of Behavioral Medicine, 26, 161-171.
Eckhardt, M., & Goodwin, M. (2012). Influencing Gaze Behavior and Expression Recognition. Extended Abstract of IMFAR 2012. Toronto, Canada.
Ekman, P., & Friesen, W. V. (1969). The repertoire of nonverbal behavior: Categories, origins, usage, and coding. Semiotica, 1, 49-98.
Esposito, A., Esposito, A. M., & Vogeld, C. (2015). Needs and challenges in human computer interaction for processing social emotional information. Pattern Recognition Letters, 66, 41-51. doi:10.1016/j.patrec.2015.02.013.
Fisher, D., DeLine, R., Czerwinski, M., & Drucke, S. (2012, May-June). Interactions with Big Data Analytics. Interactions, 50-59.
Gao, Y., Ma, L., Chen, Z., & Wu, X. (2005). Motion Normalization. In J. Tao, T. Tan, & R. Picard (Eds.), ACII 2005, LNCS 3784 (pp. 95-101). Heidelberg: Springer-Verlag Berlin.
Hehman, E., Stolier, R. M., & Freeman, J. B. (2014). Advanced mouse-tracking analytic techniques for enhancing psychological science. Group Processes & Intergroup Relations, 18(3), 384-401. doi:10.1177/1368430214538325.
Hernandez, J., Morris, R., & Picard, R. (Oct. 2011). Call Center Stress Recognition with Person-Specific Models. Affective Computing and Intelligent Interaction. Memphis, USA.
Hoque, M. E. (Oct. 2008). Analysis of Speech Properties of Neurotypicals and Individuals Diagnosed with Autism and Down Syndrome. 10th ACM Conference on Computers and Accessibility (ASSETS). Halifax, Canada.
Hoque, M. E., Morency, L., & Picard, R. (Oct. 2011). Are you friendly or just polite? - Analysis of smiles in spontaneous face-to-face interactions. Affective Computing and Intelligent Interaction. Memphis, USA.
Hoque, M., & Picard, R. (March 2011). Acted vs. natural frustration and delight: Many people smile in natural frustration. 9th IEEE International Conference on Automatic Face and Gesture Recognition (FG'11). Santa Barbara, CA, USA.
Hussein, M., Monkaresi, H., & Calvo, R. (2012). Categorical vs. Dimensional Representations in Multimodal Affect Detection during Learning. Intelligent Tutoring Systems. Chania, Greece.
Kapoor, A., & Picard, R. (November 2005). Multimodal Affect Recognition in Learning Environments. ACM MM. Singapore.
Lockton, D., Harrison, D. J., & Stanton, N. A. (2010, May). The Design with Intent Method: A design tool for influencing user behaviour. Applied Ergonomics, 41(3), 382-392. doi:10.1016/j.apergo.2009.09.001.
Lockton, D., Harrison, D. J., Cain, R., Stanton, N. A., & Jennings, P. (2013). Exploring Problem-framing through Behavioural Heuristics. International Journal of Design, 7(1), 37-53.
Mäkiaho, P., & Poranen, T. (2012). Tool Usage in Students' Software Projects. Tampere, Finland: School of Information Sciences, University of Tampere.
Mazé, R., & Redström, J. (2008). Switch! Energy ecologies in everyday life. International Journal of Design, 2(3), 55-70.
McDuff, D., Kaliouby, R., & Picard, R. (Nov. 2011). Crowdsourced data collection of facial responses. 13th IEEE International Conference on Multimodal Interaction (ICMI'11). Alicante, Spain.
McKeown, G., Valstar, M., Cowie, R., Pantic, M., & Schröder, M. (2012). The SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent. IEEE Transactions on Affective Computing, 3(1), 5-17.
Michie, S., van Stralen, M. M., & West, R.
(2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6(42).
Morris, R., & Picard, R. (May 2011). Computer-Mediated Exposure Therapy for Auditory Sensitivity in Autism Spectrum Disorder. Extended Abstract of IMFAR. San Diego, CA, USA.
Poh, M., McDuff, D., & Picard, R. (2011). Advancements in Non-contact, Multiparameter Physiological Measurements Using a Webcam. IEEE Transactions on Biomedical Engineering, 58(1), 7-11.
Reagan, A. J., Mitchell, L., Kiley, D., Danforth, C. M., & Dodds, P. S. (2016, September). The emotional arcs of stories are dominated by six basic shapes. The Computing Research Repository.
Seelye, A., Hagler, S., Mattek, N., Howieson, D. B., Wild, K., Dodge, H. H., & Kaye, J. A. (2015). Computer mouse movement patterns: A potential marker of mild cognitive impairment. Alzheimers Dement (Amst)., 1(4), 472-480. doi:10.1016/j.dadm.2015.09.006.
Stanford University. (n.d.). Stanford Human Subjects Experiments Guidelines. Retrieved from http://humansubjects.stanford.edu/hrpp/Chapter12.html.
Tao, J., & Tan, T. (2005). Affective Computing: A Review. In J. Tao, T. Tan, & R. W. Picard (Eds.), ACII 2005, LNCS 3784 (pp. 981-995). Heidelberg: Springer-Verlag Berlin.
Wood, W., Quinn, J. M., & Kashy, D. A. (2002). Habits in Everyday Life: Thought, Emotion, and Action. Journal of Personality and Social Psychology, 83(6), 1281-1297. doi:10.1037//0022-3514.83.6.1281.
Zhu, X., & Guo, D. (2014). Mapping Large Spatial Flow Data with Hierarchical Clustering. Transactions in GIS, 18(3), 421-435. doi:10.1111/tgis.12100.

Versavia-Maria Ancusa (b. March 11, 1981) received her BSc in Computer Science (2004), MSc in Advanced Computer Systems (2005), and PhD in Computer Science (2009) from “Politehnica” University of Timisoara. She is now a Senior Lecturer in the Department of Computers and Information Technology, Faculty of Automation and Computers, “Politehnica” University of Timisoara.

Ciprian-Maniu Dragoe (b. July 14, 1987) received his BSc in Computer Science (2010) and MSc in Advanced Computer Systems (2012) from “Politehnica” University of Timisoara. After 3 years working for Continental AEG, he is now a freelance engineer specialized in psychology-oriented Computer Science program development.